Problem

I need to run the below DDL:

ALTER TABLE storage_archive MODIFY duration char(7);

Currently the column is char(5) and there are nearly a trillion records in the table.

Will this DDL statement take more time if there are more records in the table, or is it unrelated to the number of records? Since I'm executing DDL rather than DML (UPDATE or DELETE), my understanding is that it should not depend on the number of records in the table.

Am I wrong on this point?

There is no correct solution

Other tips

The answer depends on what kind of ALTER you execute. If your ALTER changes only Oracle metadata (for example, adding a nullable column, adding a constraint in NOVALIDATE mode, or setting columns unused), the execution time does not depend on the volume of user data. If you move the table, drop columns, or shrink the table (anything that requires reading or manipulating the data), it does depend on the data volume. Consult the Oracle documentation to understand what Oracle does in each particular case.
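
As a sketch of that distinction (the note column here is hypothetical, added only for illustration), a metadata-only change completes almost instantly regardless of table size, while a data-touching change must visit every row:

SQL> -- Metadata only: a new nullable column does not rewrite existing rows
SQL> alter table storage_archive add (note varchar2(100));

SQL> -- Touches data: widening a char column rewrites every existing value
SQL> alter table storage_archive modify (duration char(7));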

Your particular case will touch the data blocks, because Oracle needs to append two trailing spaces to every existing value, so execution time can be huge. This is one of many reasons to prefer varchar2 over char:

SQL> create table t (x char(10));

Elapsed: 00:00:00.04
SQL> insert /*+ append */ into t select 'x' from dual connect by level <= 1000000;

Elapsed: 00:00:01.17
SQL> commit;

Elapsed: 00:00:00.04
SQL> alter table t modify (x char(20));

Elapsed: 00:00:52.85
SQL> drop table t;

Elapsed: 00:00:02.54
SQL> create table t (x varchar2(10));

Elapsed: 00:00:00.07
SQL> insert /*+ append */ into t select 'x' from dual connect by level <= 1000000;

Elapsed: 00:00:01.27
SQL> commit;

Elapsed: 00:00:00.05
SQL> alter table t modify (x varchar2(20));

Elapsed: 00:00:00.07
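
The root cause is char's blank-padding semantics: a char(20) column stores 'x' padded with spaces to the full 20 characters, so widening the column must rewrite every row, while varchar2 stores only the actual data and the widening is a pure metadata change. This can be checked against the varchar2 table above:

SQL> select length(x) from t where rownum = 1;

This returns 1 for the varchar2(20) column; the same query against the char(20) version of the table returns 20.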
License: CC-BY-SA with attribution