Oracle 19c: Auto-List Partitioning HIGH_VALUE Bugs Fixed

New features are one reason to upgrade to a new Oracle release; fixes for known bugs are another. With Oracle 19c, at least two bugs related to Automatic List Partitioning were fixed. I ran some tests with Auto-List Partitioning on an Oracle 19.3 database.

Auto-List Partitioning with CHAR Data Type

About a year ago, I was playing around with Auto-List Partitioning in Oracle 12.2. While testing, I discovered by accident that some data dictionary views display wrong information when a partition key of data type CHAR is used; for VARCHAR2, everything worked as expected. I described this behaviour in a blog post in February 2018 and opened a service request at My Oracle Support. I repeated the same test on Oracle 18c and 19c (beta) at the beginning of this year, but the issue was still present.

Now, after installing Oracle 19.3, I’m happy to see that the data dictionary views USER_TAB_PARTITIONS, ALL_TAB_PARTITIONS and DBA_TAB_PARTITIONS return the correct result for HIGH_VALUE, even for data type CHAR. A detailed description can be found in the updated blog post Having Fun with Auto-List Partitioning.
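To illustrate what was tested, here is a minimal sketch of the pattern (with hypothetical table and column names, not the ones from the original post): create an auto-list partitioned table with a CHAR partition key, insert a few rows and check HIGH_VALUE in the data dictionary.

-- Hypothetical example: Auto-List Partitioning on a CHAR column
CREATE TABLE t_char_autolist
    (
      country_code CHAR (3) NOT NULL ,
      description  VARCHAR2 (100)
    )
  PARTITION BY LIST (country_code) AUTOMATIC
  (PARTITION p_default VALUES ('XXX'));

INSERT INTO t_char_autolist VALUES ('CH', 'Switzerland');
INSERT INTO t_char_autolist VALUES ('DE', 'Germany');
COMMIT;

-- On 19.3, HIGH_VALUE shows the correct string literals for the CHAR key as well
SELECT partition_name, high_value
  FROM user_tab_partitions
 WHERE table_name = 'T_CHAR_AUTOLIST';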

Auto-List Partitioning with RAW Data Type

Another issue was discovered by Connor McDonald about a month ago. In his blog post Raw Partitions?, he explains a test case on an Oracle 18.6 database: DataPump import fails for tables with Automatic List Partitioning on RAW columns. I read this post during my holidays and decided to test the case as soon as possible. Why? Although Connor writes “I really can’t think of a reason why you ever want a table with a raw partition key” in his post, I see some use cases for this in Data Vault projects.

Partitioning on RAW Data Types in Data Vault

With Data Vault (a data modelling method for data warehouses), it is common to work with MD5 hash keys. I recommend storing them in Oracle databases as RAW(16) columns, not as CHAR(32), as is done in many Data Vault projects (see blog post Data Types of Join Columns). One possible partitioning strategy for Data Vault is Hash Partitioning on the join key columns to enable full partition-wise joins; I explained this approach as strategy 3 in the Trivadis white paper Data Vault Partitioning Strategies. In this case, the partition key is a RAW column.
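As an illustration of this use case, here is a simplified sketch of such a Hub table, hash-partitioned on its RAW(16) hash key (table and column names are generic examples, not taken from the white paper):

-- Simplified example: Hub table hash-partitioned on its RAW(16) hash key.
-- With matching partition counts on the joined tables, this enables
-- full partition-wise joins.
CREATE TABLE h_customer
    (
      H_Customer_Key RAW (16) NOT NULL ,
      Customer_Id    VARCHAR2 (20) NOT NULL ,
      Load_Date      DATE ,
      Record_Source  VARCHAR2 (4 CHAR) ,
      CONSTRAINT h_customer_pk PRIMARY KEY (H_Customer_Key)
    )
  PARTITION BY HASH (H_Customer_Key) PARTITIONS 16;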

After my holidays, I did a quick test with my Data Vault demo schema on an Oracle 12.2 database and noted with relief that DataPump export/import works without problems. So the bug described by Connor only affects Auto-List Partitioning, but there it occurs on both Oracle 12c and 18c. The following test is done on an Oracle 12.2 database.

First, I create a table with Auto-List Partitioning on a RAW column. Because I am lazy, I just copy a Hub table from my Data Vault demo schema. Of course, it does not make sense to use Auto-List Partitioning on a primary key column, but for this test case that’s fine. I insert 20 rows from the original table, which automatically creates 20 new partitions in the test table.

 
CREATE TABLE h_beer
    (
      H_Beer_Key    RAW (16) NOT NULL ,
      Beer_Name     VARCHAR2 (40) ,
      Load_Date     DATE ,
      Record_Source VARCHAR2 (4 CHAR)
    )
  PARTITION BY LIST (H_Beer_Key) AUTOMATIC
  (PARTITION p_dummy VALUES ('00000000000000000000000000000000'));
 
Table H_BEER created.
 
INSERT INTO h_beer
SELECT * FROM craftbeer_dv.h_beer;
 
20 rows inserted.
 
COMMIT;
 
Commit complete.
 
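Then I export the table with DataPump and drop it from my test schema. The commands look roughly like this (a sketch; the parameters are the same ones that appear in the import log below):

expdp system/******** directory=data_pump_dir dumpfile=h_beer.dmp tables=dani.h_beer

DROP TABLE dani.h_beer;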
Now, I try to import the table again, and the import fails, exactly as described in Connor’s blog post:

 
Import: Release 12.2.0.1.0 - Production on Fri May 31 08:07:25 2019
 
Copyright (c) 1982, 2017, Oracle and/or its affiliates.  All rights reserved.
 
Connected to: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01":  system/******** directory=data_pump_dir dumpfile=h_beer.dmp tables=dani.h_beer
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"DANI"."H_BEER" failed to create with error:
ORA-14308: partition bound element must be one of: string, datetime or interval literal, number, or NULL
 
The reason for the failure is actually not DataPump itself, but the way the HIGH_VALUE column is stored in the data dictionary. The value contains a HEXTORAW function call, not just the hash value. These stored values are the root cause of the import failure.

 
SELECT table_name, partition_name, high_value
  FROM user_tab_partitions
 WHERE table_name = 'H_BEER';
 
TABLE_NAME PARTITION_NAME  HIGH_VALUE
---------- --------------- --------------------------------------------------
H_BEER     P_DUMMY         '00000000000000000000000000000000'
H_BEER     SYS_P21860      HEXTORAW('CFBD1173837FB38F81BB06CF34AEB77F')
H_BEER     SYS_P21861      HEXTORAW('A61723924FFC1C8AAD031341B9041EE3')
H_BEER     SYS_P21862      HEXTORAW('5A4CEB285627C32034D42FBD45AED530')
H_BEER     SYS_P21863      HEXTORAW('B33B0848CBFEFC17EF025818F510B449')
H_BEER     SYS_P21864      HEXTORAW('658A2E4346956BE69EBA564CEF3C0BF8')
H_BEER     SYS_P21865      HEXTORAW('99CED521EBF96C59FC6F225BAF8E9307')
H_BEER     SYS_P21866      HEXTORAW('A9AD7C66CB3003182EEC0DD17C5F2412')
H_BEER     SYS_P21867      HEXTORAW('70786EEA2914EDFF39673A05C0FD3CA7')
H_BEER     SYS_P21868      HEXTORAW('59C8E50EA8656B4E040ACD162CFD123A')
H_BEER     SYS_P21869      HEXTORAW('58D1B3149348D7F855A42CC595D9D6CD')
H_BEER     SYS_P21870      HEXTORAW('B2807CD3BCC39D0284E451F004849F4F')
H_BEER     SYS_P21871      HEXTORAW('5549344F424320560F07453E3AEFA20E')
H_BEER     SYS_P21872      HEXTORAW('BE61CD8A3A9D6B2F754EF26C75E4B6F5')
H_BEER     SYS_P21873      HEXTORAW('9B151A4EB8728165A5D030F3BB7FA202')
H_BEER     SYS_P21874      HEXTORAW('F724EC0F3C4D9EC12D3137C07492FDF4')
H_BEER     SYS_P21875      HEXTORAW('B0307BFAFFE8767596AA5897D2119344')
H_BEER     SYS_P21876      HEXTORAW('2F66FEE82C3AA40850F58E16979E737C')
H_BEER     SYS_P21877      HEXTORAW('04793F0BA33878031AB48954EB46E7CC')
H_BEER     SYS_P21878      HEXTORAW('330F532959C43CCADF29B4701F89C98D')
H_BEER     SYS_P21879      HEXTORAW('4599D74824EE82B205643BA9FBB48C1C')
 
21 rows selected.
 
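DataPump generates the CREATE TABLE statement for the import from these stored HIGH_VALUE strings, and a HEXTORAW() call is not one of the literal types allowed as a partition bound. The failure can presumably be reproduced with a simplified DDL sketch like the following (table name and partition list are just examples):

-- Simplified sketch of what the generated import DDL effectively contains;
-- the HEXTORAW() call in the partition bound is presumably what raises ORA-14308.
CREATE TABLE h_beer_repro
    (
      H_Beer_Key RAW (16) NOT NULL
    )
  PARTITION BY LIST (H_Beer_Key) AUTOMATIC
  (PARTITION p_dummy     VALUES ('00000000000000000000000000000000'),
   PARTITION sys_p21860  VALUES (HEXTORAW('CFBD1173837FB38F81BB06CF34AEB77F')));

-- ORA-14308: partition bound element must be one of: string, datetime
-- or interval literal, number, or NULL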
When I try to import the same dump file into an Oracle 19.3 database with DataPump, it fails with the same ORA-14308 error. But the good news is: if I repeat the complete test on Oracle 19.3, it works without any problems.

Same Test on Oracle 19.3

For a few days now, I have been able to work with a brand-new Oracle 19.3 database (thanks to Stefan Oehrli for the Docker configuration files).

So, let’s do the same steps again: create the auto-list partitioned table H_BEER, load it with 20 rows and check the 20 automatically created partitions in the data dictionary. As you can see, the HIGH_VALUE format in Oracle 19.3 is now different; no HEXTORAW call is stored in the column:

 
SELECT table_name, partition_name, high_value
  FROM user_tab_partitions
 WHERE table_name = 'H_BEER';
 
TABLE_NAME PARTITION_NAME  HIGH_VALUE
---------- --------------- --------------------------------------------------
H_BEER     P_DUMMY         '00000000000000000000000000000000'
H_BEER     SYS_P492        'CFBD1173837FB38F81BB06CF34AEB77F'
H_BEER     SYS_P493        'A61723924FFC1C8AAD031341B9041EE3'
H_BEER     SYS_P494        '5A4CEB285627C32034D42FBD45AED530'
H_BEER     SYS_P495        'B33B0848CBFEFC17EF025818F510B449'
H_BEER     SYS_P496        '658A2E4346956BE69EBA564CEF3C0BF8'
H_BEER     SYS_P497        '99CED521EBF96C59FC6F225BAF8E9307'
H_BEER     SYS_P498        'A9AD7C66CB3003182EEC0DD17C5F2412'
H_BEER     SYS_P499        '70786EEA2914EDFF39673A05C0FD3CA7'
H_BEER     SYS_P500        '59C8E50EA8656B4E040ACD162CFD123A'
H_BEER     SYS_P501        '58D1B3149348D7F855A42CC595D9D6CD'
H_BEER     SYS_P502        'B2807CD3BCC39D0284E451F004849F4F'
H_BEER     SYS_P503        '5549344F424320560F07453E3AEFA20E'
H_BEER     SYS_P504        'BE61CD8A3A9D6B2F754EF26C75E4B6F5'
H_BEER     SYS_P505        '9B151A4EB8728165A5D030F3BB7FA202'
H_BEER     SYS_P506        'F724EC0F3C4D9EC12D3137C07492FDF4'
H_BEER     SYS_P507        'B0307BFAFFE8767596AA5897D2119344'
H_BEER     SYS_P508        '2F66FEE82C3AA40850F58E16979E737C'
H_BEER     SYS_P509        '04793F0BA33878031AB48954EB46E7CC'
H_BEER     SYS_P510        '330F532959C43CCADF29B4701F89C98D'
H_BEER     SYS_P511        '4599D74824EE82B205643BA9FBB48C1C'
 
21 rows selected.
 
In the next step, I do a DataPump export from the 19.3 database:

 
expdp system/manager directory=data_pump_dir dumpfile=h_beer.dmp tables=dani.h_beer
 
Export: Release 19.0.0.0.0 - Production on Fri May 31 08:30:43 2019
Version 19.3.0.0.0
 
Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.
 
Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/******** directory=data_pump_dir dumpfile=h_beer.dmp tables=dani.h_beer
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "DANI"."H_BEER":"P_DUMMY"                       0 KB       0 rows
. . exported "DANI"."H_BEER":"SYS_P492"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P493"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P494"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P495"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P496"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P497"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P498"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P499"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P500"                  6.382 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P501"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P502"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P503"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P504"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P505"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P506"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P507"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P508"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P509"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P510"                  6.390 KB       1 rows
. . exported "DANI"."H_BEER":"SYS_P511"                  6.390 KB       1 rows
Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
 /u01/import/h_beer.dmp
Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at Fri May 31 08:31:08 2019 elapsed 0 00:00:23

 
After dropping the test table, I start the DataPump import. As you can see, it finishes successfully: the table H_BEER with its 21 partitions is recreated from the dump file without any errors.

 
impdp system/manager directory=data_pump_dir dumpfile=h_beer.dmp tables=dani.h_beer
 
Import: Release 19.0.0.0.0 - Production on Fri May 31 08:33:38 2019
Version 19.3.0.0.0
 
Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.
 
Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01":  system/******** directory=data_pump_dir dumpfile=h_beer.dmp tables=dani.h_beer
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DANI"."H_BEER":"P_DUMMY"                       0 KB       0 rows
. . imported "DANI"."H_BEER":"SYS_P492"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P493"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P494"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P495"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P496"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P497"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P498"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P499"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P500"                  6.382 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P501"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P502"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P503"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P504"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P505"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P506"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P507"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P508"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P509"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P510"                  6.390 KB       1 rows
. . imported "DANI"."H_BEER":"SYS_P511"                  6.390 KB       1 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "SYSTEM"."SYS_IMPORT_TABLE_01" successfully completed at Fri May 31 08:34:02 2019 elapsed 0 00:00:22
 
