I recently worked on migrating an on-premises Oracle database to Amazon RDS. Below are the queries I used most frequently during that implementation, for both plain Oracle and Amazon RDS for Oracle.
1. Listing the data file sizes for every tablespace
select tablespace_name, bytes from dba_data_files;
2. Checking the status of long-running operations
SELECT sid, serial#, opname, context, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 2) AS "%_COMPLETE"
FROM v$session_longops
WHERE opname NOT LIKE '%aggregate%'
  AND totalwork != 0
  AND sofar <> totalwork;
3. Listing the files in a directory
select * from table (rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'));
4. Reading an import log file
select * from table (rdsadmin.rds_file_util.read_text_file(
p_directory => 'DATA_PUMP_DIR',
p_filename => 'abc.log'));
5. Removing a file from a directory
exec utl_file.fremove('DATA_PUMP_DIR','IMPORT.LOG');
6. How to kill a session in RDS Oracle
begin
  -- replace sid and serial_number with the values for the target session
  rdsadmin.rdsadmin_util.kill(sid => sid, serial => serial_number);
end;
/
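The sid and serial_number placeholders above come from V$SESSION. As a sketch, assuming the session to kill belongs to the HR user (an illustrative username, not from the original):

-- find the SID and SERIAL# of the session to kill
SELECT sid, serial#, username, status
FROM v$session
WHERE username = 'HR';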
7. Granting common privileges
GRANT EXEMPT ACCESS POLICY TO "SYSTEM";
GRANT GRANT ANY ROLE TO "SYSTEM" WITH ADMIN OPTION;
GRANT DEBUG ANY PROCEDURE TO "HR";
GRANT CREATE VIEW TO "HR";
8. Granting permissions on SYS objects in AWS RDS
begin
  rdsadmin.rdsadmin_util.grant_sys_object(
    p_obj_name  => 'V_$SESSION',
    p_grantee   => 'HR',
    p_privilege => 'SELECT');
end;
/
9. Exporting a schema from a database in AWS RDS / Oracle
================================================================
DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'ABC.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'ABC.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''ABC'')');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
================================================================
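Because START_JOB returns immediately, you can check whether the export is still running by querying the Data Pump job views (in addition to V$SESSION_LONGOPS from item 2):

-- shows the state of Data Pump jobs (e.g. EXECUTING, NOT RUNNING)
SELECT owner_name, job_name, operation, job_mode, state
FROM dba_datapump_jobs;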
10. Counting the tables visible to the current user
select count(*) from all_objects where object_type = 'TABLE';
-- query dba_objects instead of all_objects to count every table in the database
11. Dropping an AQ queue table
EXEC dbms_aqadm.drop_queue_table(queue_table => 'HR.AQ$_HR_EMP_SQ_I', force => true);
12. Creating a tablespace
create tablespace TEMP1 datafile size 5G autoextend on
maxsize 10G;
16. Creating a user (schema) in Oracle or Amazon RDS
Sample Syntax:
create user schema_1 identified by <password>;
grant create session, resource to schema_1;
alter user schema_1 quota 100M on users;
Example:
CREATE USER HR IDENTIFIED BY HR
  DEFAULT TABLESPACE USERS
  TEMPORARY TABLESPACE TEMP
  QUOTA UNLIMITED ON USERS;
GRANT CREATE SESSION, CREATE TABLE to HR;
17. Importing a dump file in Amazon RDS
With the dump file in DATA_PUMP_DIR, we can start the import with the following block:
================================================================
DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA',job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'HR.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'HR.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''HR'')');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
================================================================
18. How to copy your dump files to an S3 bucket from a Linux server