How to Migrate to AWS RDS - Oracle Database Migration To RDS - Step By Step Tutorial
In this video I demonstrate how to migrate an on-premises Oracle database to Amazon RDS for Oracle using Data Pump export/import. I hope this helps fellow DBAs...
Source Side Commands:
CREATE USER APPUSER
IDENTIFIED BY "appuser12345"
DEFAULT TABLESPACE APP_TBS
TEMPORARY TABLESPACE TEMP
PROFILE DEFAULT
ACCOUNT UNLOCK;
-- Roles for APPUSER
GRANT CONNECT,RESOURCE TO APPUSER;
ALTER USER APPUSER DEFAULT ROLE ALL;
-- 1 System Privilege for APPUSER
GRANT UNLIMITED TABLESPACE TO APPUSER;
--Objects
create table appuser.my_objects as select * from dba_objects;
create table appuser.my_tables as select * from dba_tables;
create table appuser.my_indexes as select * from dba_indexes;
create index appuser.my_objects_ix1 on appuser.my_objects(object_name);
create index appuser.my_tables_ix1 on appuser.my_tables(table_name);
create index appuser.my_indexes_ix1 on appuser.my_indexes(index_name);
create sequence appuser.my_sequence start with 1 increment by 1;
select object_name,object_type from dba_objects where owner='APPUSER';
select * from DBA_DIRECTORIES order by directory_name;
expdp userid=\'/ as sysdba\' DIRECTORY=DATA_PUMP_DIR DUMPFILE=appuser_onprem.dmp LOGFILE=expdp_appuser_onprem.log SCHEMAS=APPUSER FLASHBACK_TIME=SYSTIMESTAMP
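After the export completes, the dump file has to reach the S3 bucket before the RDS side can pull it down. One common way, sketched here under the assumption that the AWS CLI is installed and credentialed on the source server (the local dump path is an example; the bucket name and prefix are the ones used in this demo):

```shell
# Copy the Data Pump dump from the server's DATA_PUMP_DIR location
# (example path) to the S3 bucket used in this demo
aws s3 cp /u01/app/oracle/admin/ORCL/dpdump/appuser_onprem.dmp \
    s3://migrationproject/dba/appuser_onprem.dmp
```

Alternatively, as mentioned later in the Q&A, the S3 bucket can be mounted as a file system on the source server so the export writes to it directly.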
RDS Side Commands:
CREATE USER APPUSER
IDENTIFIED BY "appuser12345"
DEFAULT TABLESPACE USERS
TEMPORARY TABLESPACE TEMP
PROFILE DEFAULT
ACCOUNT UNLOCK;
-- Roles for APPUSER
GRANT CONNECT,RESOURCE TO APPUSER;
ALTER USER APPUSER DEFAULT ROLE ALL;
-- 1 System Privilege for APPUSER
GRANT UNLIMITED TABLESPACE TO APPUSER;
select * from dba_users where username='APPUSER';
select object_name,object_type from dba_objects where owner='APPUSER';
select * from DBA_DIRECTORIES order by directory_name;
Downloading from S3:
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
p_bucket_name => 'migrationproject',
p_s3_prefix => 'dba/appuser_onprem.dmp',
p_directory_name => 'DATA_PUMP_DIR')
AS TASK_ID FROM DUAL;
1670703010887-1356 -- the returned task ID
Reading the Task Logfile:
SELECT text FROM table(RDSADMIN.rds_file_util.read_text_file('BDUMP','dbtask-1670703010887-1356.log'));
Listing the Directory:
select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime desc;
Import Job:
DECLARE
v_hdnl NUMBER;
BEGIN
v_hdnl := DBMS_DATAPUMP.OPEN(
  operation => 'IMPORT',
  job_mode  => 'SCHEMA',
  job_name  => null);
DBMS_DATAPUMP.ADD_FILE(
  handle    => v_hdnl,
  filename  => 'appuser_onprem.dmp',
  directory => 'DATA_PUMP_DIR',
  filetype  => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.ADD_FILE(
  handle    => v_hdnl,
  filename  => 'impdp_appuser_onprem.log',
  directory => 'DATA_PUMP_DIR',
  filetype  => dbms_datapump.ku$_file_type_log_file);
DBMS_DATAPUMP.METADATA_FILTER(v_hdnl,'SCHEMA_EXPR','IN (''APPUSER'')');
-- DBMS_DATAPUMP.METADATA_REMAP(v_hdnl,'REMAP_SCHEMA','APPUSER','APPUSER2');
DBMS_DATAPUMP.METADATA_REMAP(v_hdnl,'REMAP_TABLESPACE','APP_TBS','USERS');
-- DBMS_DATAPUMP.SET_PARALLEL(v_hdnl,2);
DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/
Reading the import log file:
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR','impdp_appuser_onprem.log'));
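While the DBMS_DATAPUMP job above runs asynchronously, its progress can be checked from another SQL session. A quick sketch using the standard Oracle dictionary views:

```sql
-- Check the state of active Data Pump jobs
SELECT owner_name, job_name, operation, job_mode, state
  FROM dba_datapump_jobs;

-- Confirm the imported objects arrived in the target schema
SELECT object_type, COUNT(*)
  FROM dba_objects
 WHERE owner = 'APPUSER'
 GROUP BY object_type;
```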
Comments: 93
Excellent. To the point...
@YouVolve
A month ago
Thanks for your feedback.
Simple, clear, and precise to the objective of the demo... thank you
@YouVolve
A year ago
Thanks for your feedback
Absolutely excellent tutorial - exactly what I was looking for...
@YouVolve
A year ago
Thanks Rohit for your feedback.
Best practical demo ever explained. Keep it up, bro ❤❤❤
@YouVolve
A year ago
Thanks a lot for your feedback.
Thank you so much, great content
@YouVolve
A year ago
Thanks for watching
Very good explanation
@YouVolve
10 months ago
@user-ig1my8kd4z - Thanks for watching my video and providing your feedback.
Damn, bro, I have no words to express my feelings about your help. Today I completed this project, which took me long enough, but because of your help I got it done, and now I am very happy. Much love and much respect to you for helping me out with fast responses in the comment section. Love you, bro ❤
@YouVolve
10 months ago
Thanks for your feedback. Glad that it helped.😀
Superb...
@YouVolve
A year ago
Thank you! Cheers!
Great Educational Video! Have you posted, or are you planning to post, a video on how to use the AWS Schema Conversion Tool (SCT) or Database Migration Service (DMS) to move data when the system has a zero-downtime requirement? A Data Pump migration will always involve some downtime on a production system, because you have to shut down the on-premises system, move the Data Pump file to AWS, import the file, test everything, and then switch users to the AWS RDS instance. If the Data Pump files are too big (TB), we have to use the Snowball service from Amazon: they send us a storage device, and we ship it back so they can upload it to S3. This process can take several days.
@YouVolve
A year ago
Hello Moe, thanks for your interest. I have mentioned all of the migration options - expdp/impdp with downtime, AWS DMS with less downtime, and GoldenGate replication without downtime. I will cover these topics in future videos for sure. I have not done my research on SCT yet.
Good tutorial, Great Job. But the background music isn't required
@YouVolve
A year ago
Thanks Murali for your feedback. In fact, many viewers are suggesting that the background music is not needed. I am considering removing the music in my future tutorials.
Great Video! If I want to migrate an Oracle 19c database from one source RDS which has 30 schemas to another Oracle RDS in another account, do I need to do one schema at a time, or is there a way to export/import all 30 schemas together using DBMS_DATAPUMP?
@YouVolve
A year ago
Hi Tony, you can export all the 30 schemas together. Once you define the handle (v_hndl in the example below) for the job, you can add the metadata filter as below to include the schemas you want: DBMS_DATAPUMP.METADATA_FILTER(v_hdnl,'SCHEMA_EXPR','IN (''SCHEMA1'',''SCHEMA2'')');
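For reference, a minimal sketch of the multi-schema export job described in the reply above (the schema names and file name are placeholders; extend the IN list to all 30 schemas):

```sql
DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN(
    operation => 'EXPORT',
    job_mode  => 'SCHEMA',
    job_name  => null);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'schemas_export.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);
  -- One metadata filter covers all the schemas to export
  DBMS_DATAPUMP.METADATA_FILTER(v_hdnl, 'SCHEMA_EXPR',
    'IN (''SCHEMA1'',''SCHEMA2'',''SCHEMA3'')');
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/
```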
background music for what? it is a technical session and not a movie.
@YouVolve
11 months ago
@user-fl5kl7ci9h, thanks for watching my videos and for the feedback. I have received the suggestion to remove the background music from many viewers and have already started implementing it. My most recent videos don't have the music anymore.
Great Informational Video! My understanding is that RDS can only support one Oracle database as a standalone instance. If you have 3 Oracle databases/instances running on the on-premises server, do you create 3 RDS instances, or do you add all the databases as separate schemas into a single RDS instance?
@YouVolve
A year ago
You may migrate the 3 on-premises databases to 3 individual RDS instances, 3 PDBs in one container database, or 3 individual schemas in a single RDS instance. The 3rd option is cost-effective as it needs only one license.
@moedweik9523
A year ago
@@YouVolve Yes, but doesn't the 3rd option cause performance and security problems too, because you have all 3 databases using one Oracle instance (memory resources and the same Oracle processes) versus one instance per database on premises? If we size up the RDS to more CPUs, then we need more licenses. According to AWS too, RDS will only support one container and one PDB (not multiple). You will also have a single point of failure, so if the instance fails all 3 databases will be down.
@YouVolve
A year ago
Hello Moe, you are in the best position to decide on the options... and yes, you are right, I have overlooked the PDB limits currently in RDS. For smaller and logically related databases, consolidation is good as you can still maintain the security and resource allocation using proper grants and resource manager plans.
Can we use Oracle Data Guard between on-prem Oracle 19c and an AWS RDS instance? If yes, any guidance on how, please?
@YouVolve
3 months ago
@anuragbishnoi - Thanks for watching my video and for the question. Oracle Data Guard cannot be set up between an on-premises database and an RDS instance. RDS is a managed database service and, as I explained in my videos, many DBA activities, even logging in with the SYS or SYSTEM users, are not allowed. However, Data Guard can be set up between an on-prem Oracle database and an Oracle database hosted on an AWS EC2 server. For RDS migration purposes, you may use Data Pump export, GoldenGate replication, AWS DMS, etc.
Hi brother, I have a doubt: can I know how to do the migration for a full Oracle database?
@YouVolve
6 days ago
Hello @dineshnagarajan7633 - You can use the FULL=Y option to export the full database. However, it is a logical copy. If you want to create an exact replica of the source database, then you may go for the EC2 option in AWS, where you get full server access and can even do an RMAN restore.
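As a sketch, the full export mentioned in the reply above would look something like this on the source side, mirroring the schema-mode command from the tutorial (the dump and log file names here are placeholders):

```shell
# Full-database Data Pump export (logical copy of the whole database)
expdp userid=\'/ as sysdba\' DIRECTORY=DATA_PUMP_DIR DUMPFILE=full_onprem.dmp LOGFILE=expdp_full_onprem.log FULL=Y FLASHBACK_TIME=SYSTIMESTAMP
```

On the RDS side, the DBMS_DATAPUMP import block from the tutorial would use job_mode => 'FULL' instead of 'SCHEMA'.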
Can we create a DB link between both databases, for example between catdb and rdstest, to copy the dump file?
@YouVolve
A year ago
Hi Chetan, thanks for watching my video and for the question. You can definitely create DB links between two RDS DBs, or between RDS and on-premises, as long as the VPC has been set up and the network rules have been set properly. But creating an export dump over a network_link has limitations; for example, some data types are not supported.
@chetanpatil2612
A year ago
@@YouVolve got it thanks 😊
Excellent Video! Does this method work for a 100 GB - 200 GB on-premises database or not? I think Data Pump will compress the data, but S3 might have a size limit of 5 GB. I would also need large local storage on my laptop to download from the server and then upload to S3.
@YouVolve
A year ago
Thanks for your question. Yes, it will definitely work. You have to allocate storage as per your requirement and also keep storage auto-scaling on, up to a maximum limit that you think is reasonable. I explained this in my previous video on creating the RDS instance. Also, as I mentioned in this video, it is possible to mount an S3 bucket as a file system on the source server, so you can upload the dumps directly to S3 if this arrangement is in place. Downloading to your local storage takes time and requires large storage, so it is not suggested for large uploads. Up to around 300 GB it should be okay.
@moedweik9523
A year ago
@@YouVolve Thanks for your reply! I assume you need 200 GB of temporary S3 storage to store the Data Pump files and 200 GB of RDS database storage, and I can delete the files from S3 after importing them to RDS. My understanding is that S3 storage has a maximum file size limitation of 5 GB, though. How do you work around this file size limitation?
@YouVolve
A year ago
Sorry for the delayed reply. I have never come across any such 5 GB file size limitation in S3. Also, you can delete unwanted files from the RDS host using the utl_file APIs to free up space.
Got this error - any help? "The user-specified API parameters were of the wrong type or value range. Subsequent messages supplied by DBMS_DATAPUMP.GET_STATUS will further describe the error. *Action: Correct the bad argument and retry the API"
@YouVolve
A month ago
There may be a syntax error, or the type of a parameter value may be wrong. Please share the full command you are using.
Which application did you use to connect to your database and run SQL commands?
@YouVolve
10 months ago
I usually use SQL Developer, but SQL*Plus or any SQL*Net client like Toad can also be used.
Hi, there is background noise. Please consider this.
@YouVolve
8 months ago
Hi @user-xr1tl5vd3r, thanks for watching my video and providing your feedback. Yes, for all my recent videos I no longer add background music, respecting the suggestions from viewers like you. Since this is an old one, it is difficult to remove now, but going forward my videos will have no background music or noise.
Please create a video on on-premises DB migration through the Amazon DMS service and using the GoldenGate utility
@YouVolve
A year ago
Hi Chetan, thanks for your interest. DMS is on my to-do list, but right now I am working on a series on OEM, and after that it will be performance tuning... so maybe a couple of months.
Super! Can you please teach me some Oracle migration steps to AWS? I am unable to find the Google Drive link for the steps.
@YouVolve
7 months ago
@souravg1794 - Thanks for watching my video and providing your feedback. All the steps demonstrated in the video are also available in the description of the video. Please let me know if you are unable to copy them.
Do we need to open ports and the firewall in order to connect the RDS and on-prem databases? If yes, please help me do so.
@YouVolve
10 months ago
@karunberi - yes, if your RDS instance is not open to the internet, then you must open the ports and allow traffic between the RDS and on-prem machines by configuring your network/firewall rules. This is usually done by your organization's network security team, not by DBAs. Also, you have to create/configure the Security Groups under the AWS account. The AWS documentation is very thorough on this, and you should be able to do it by going through the documents.
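As a hedged example of the Security Group piece mentioned above, using the AWS CLI (the group ID and CIDR below are placeholders; 1521 is the default Oracle listener port):

```shell
# Allow inbound Oracle listener traffic from the on-prem network
# into the security group attached to the RDS instance
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 1521 \
    --cidr 203.0.113.0/24
```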
@karunberi
10 months ago
@@YouVolve I appreciate your comment, but can you tell me whether I need to do this for the procedure in this video? If I follow you, do I need to open my on-prem database ports? I am working on a project for my degree... I am trying to migrate my Linux database to the AWS cloud, but someone told me I cannot do it if I don't open ports, which is not DBA work...
@YouVolve
10 months ago
For your degree project, you don't have to go through corporate-level security measures. You may create a free AWS account for educational purposes and create an internet-accessible RDS instance which, I believe, is accessible for 30 days and has some restrictions. Once your RDS instance is accessible, you can migrate your on-prem database to AWS using the method I demonstrated in the video.
Can we do full database migration using this method if database size is around 25 GB?
@YouVolve
9 months ago
Hi Pankaj, thanks for your question. Yes, you can use this method for offline migration. Smaller databases like 25 GB are good candidates.
@PGExcellenceClasseswecan
9 months ago
@@YouVolve Thanks for help
Just a suggestion: please lower the music volume. It makes it difficult to focus.
@YouVolve
A year ago
Thanks for your honest feedback. I am thinking of completely removing the music in my future videos, as I have received such feedback multiple times.
Please respond: can we connect to an Oracle database directly with an AWS Glue JDBC connection, without migrating the database to the cloud?
@anuppande7356
A year ago
If yes, please explain how.
@YouVolve
A year ago
Thanks for your question. I am yet to experiment with that, but there is a good post on how to do it using a third-party JDBC driver at this link: www.cdata.com/kb/tech/oracledb-jdbc-aws-glue.rst
@anuppande7356
A year ago
@@YouVolve Thank you for your response. Can you please provide more information: apart from Glue, is there any other service you have used to make a direct connection to an on-premises data source, using a VPN or something, without migrating the data to the cloud?
@YouVolve
A year ago
Hi Anup, as I have yet to experiment on this subject and upload videos, I won't be able to provide any information at the moment. I can assure you that once I do, I will upload videos and answer relevant questions. For now, I suggest you go through the technical support documents and articles from your cloud service provider for your specific questions. Instead of asking a generic question, if you search for your specific issue you should be able to find the solution. For example, if your application is hosted on an AWS EC2 instance, it should be fairly simple to connect to databases hosted on-premises once your VPC, gateway, and security groups/rules are well defined.
This is a very important project for me, and now I am getting an error on "select rdsadmin.rdsadmin...". It says rdsadmin is an invalid identifier. I did all the steps of creating the RDS instance with S3 bucket integration, but still this error... please help.
@YouVolve
10 months ago
You have to provide some detail: exactly what were you doing when this error message came, and what is the full error message?
@karunberi
10 months ago
@@YouVolve I was performing the download-from-S3 task. When I executed the command that downloads the file from S3 to the RDS instance, it displayed an "invalid identifier" error: rdsadmin.rdsadmin.download_from_s3(p_bucket_name... (the whole command)
@YouVolve
10 months ago
@@karunberi Got it... looks like the error message came during the S3 operation. It happens when the S3_INTEGRATION option is not enabled in the option group for the RDS instance. Please add that option to the option group and the issue should be resolved.
Can I do all these steps in SQL Developer? Please answer.
@YouVolve
10 months ago
Yes, you can do all the steps that involve execution of SQL or PL/SQL code in SQL Developer, SQL*Plus, or any SQL*Net client like Toad. Please note that for RDS instances you won't have access to the operating system or the host. Also, you cannot log in to the DB with the SYS or SYSTEM account. For example, you cannot run the expdp/impdp command-line utilities to export/import data; instead you have to use the PL/SQL APIs, as I have demonstrated in the video.
@karunberi
10 months ago
@@YouVolve I really liked your video and I'm almost done now, but I want to ask: how can I make my own PL/SQL for my schema? For example, if I have a schema named Karun which has only 2 tables, can I use your PL/SQL code to import it into my RDS by just changing the user name from APPUSER to KARUN?
@YouVolve
10 months ago
@@karunberi Thanks for watching my video and the feedback. In my video the APPUSER is just an example, and yes, you may change it to the appropriate schema name in your database.
Very good job! But the background music is very irritating.
@YouVolve
11 months ago
Thanks Manoj for watching my videos and for the feedback. I have received this suggestion to remove the background music from many viewers and have already started implementing it. My most recent videos don't have the music anymore.
@ManojYadav-lw8tu
11 months ago
@@YouVolve Can you provide your number for classes?
@YouVolve
11 months ago
Hi Manoj, thanks for your interest. I don't take classes as such, but I am working on uploading a few Udemy courses on Oracle/MySQL/Aurora databases, OEM, VM, AWS/RDS, etc. If you have subscribed to my YouTube channel, you will be able to enroll in the courses at a discount.
@ManojYadav-lw8tu
11 months ago
@@YouVolve thank you let me check
@YouVolve
11 months ago
@@ManojYadav-lw8tu I have not uploaded the courses to Udemy yet, but will do it soon.
@youvolve Followed your video for a test case and received this error when I try to download the dump file from S3 to the RDS directory. Do you think this is related to permissions or something else? The bucket does have a role with a policy for READ bucket permissions. SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3( p_bucket_name => 'rds-bucket-001', p_s3_prefix => 'oracle-dba/hr_onprem.dmp', p_directory_name => 'DATA_PUMP_DIR') AS TASK_ID FROM DUAL; ORA-00904: "RDSADMIN"."RDSADMIN_S3_TASKS"."DOWNLOAD_FROM_S3": invalid identifier. Error at Line: 20, Column: 8. aws.amazon.com/premiumsupport/knowledge-center/rds-oracle-s3-integration/
@YouVolve
A year ago
Please enable the S3_INTEGRATION option for the RDS instance. Once that is enabled, it will then create the required packages like RDSADMIN_S3_TASKS under the RDSADMIN schema. With that package, you will be able to upload and download to and from the S3 and the error won't come. I have mentioned this in the video.
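The fix described in this reply can also be scripted. A hedged AWS CLI sketch for adding the S3_INTEGRATION option through a custom option group (the option group and instance names below are placeholder examples; match the engine name/version to your instance):

```shell
# Create a custom option group, add the S3_INTEGRATION option,
# and attach the group to the RDS instance (names are examples)
aws rds create-option-group \
    --option-group-name my-oracle-og \
    --engine-name oracle-ee \
    --major-engine-version 19 \
    --option-group-description "Oracle option group with S3 integration"
aws rds add-option-to-option-group \
    --option-group-name my-oracle-og \
    --options OptionName=S3_INTEGRATION \
    --apply-immediately
aws rds modify-db-instance \
    --db-instance-identifier my-oracle-rds \
    --option-group-name my-oracle-og \
    --apply-immediately
```

The instance also needs an IAM role attached that grants it access to the S3 bucket, as the knowledge-center link in the question describes.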
@moedweik9523
A year ago
@@YouVolve At which minute did you mention the S3_INTEGRATION feature? I missed it. I had to do two things to get this working: add a role with a policy giving RDS permission to read S3 buckets, and add an option group with the S3_INTEGRATION feature, as I could not edit the default Oracle option group. The import worked, but I had problems with code compilation failures due to missing database links to other schemas I had not imported yet. Some materialized views failed too. Is there a way to read the output file like a traditional operating-system file instead of running a SQL query, as this output is small and difficult to read?
@YouVolve
A year ago
Hello Moe, I mentioned the S3_INTEGRATION option at the 3:30 mark of the video. It is good to create your own custom option group for your RDS instances. There is no way to read the files other than using SQL as long as they are on the RDS host. You can download them to your local machine if you want to read them the traditional way. However, the audit, trace, and log files for the instance can be found under the "Logs & Events" tab of the instance, and you can view, watch, or download them with ease.
@moedweik9523
A year ago
@@YouVolve How do you download the import log file generated on the RDS instance to the local machine, to read it like any file without using SQL? Do you use an S3 bucket first, or download directly using some RDS API or procedure?
@YouVolve
A year ago
To download the log files to your local machine to read them the traditional way, you do exactly the opposite of what you did while transferring the dumps to the RDS host. So, once your log is generated on the RDS host in the location pointed to by the DATA_PUMP_DIR object, you first use the rdsadmin_s3_tasks.upload_to_s3 package to upload the log to the S3 bucket, and then from the S3 bucket you download it to your local machine.
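The reverse transfer described in this reply can be sketched as follows, mirroring the download_from_s3 call from the tutorial (the bucket name and prefix are the demo's example values):

```sql
-- Upload the import log from DATA_PUMP_DIR on the RDS host back to S3
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
  p_bucket_name    => 'migrationproject',
  p_prefix         => 'impdp_appuser_onprem.log',
  p_s3_prefix      => 'dba/',
  p_directory_name => 'DATA_PUMP_DIR')
AS TASK_ID FROM DUAL;
```

As with the download task, the returned task ID can be used to read the task log from the BDUMP directory with rds_file_util.read_text_file.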
Is it not possible to copy the data using SQL Developer sessions from the source DB to the destination DB?
@YouVolve
11 months ago
Hi, it depends. Can you please share how you are planning to copy the data using SQL Developer and how big is the database?
@moedweik9523
11 months ago
@@YouVolve I have several databases with multiple schemas, with sizes from 200 MB to 5 GB. I was thinking of using the SQL Developer database copy, selecting the source as one RDS database in one AWS cloud account and the destination as another RDS database in another AWS account.
@YouVolve
11 months ago
@@moedweik9523 Yes, for small databases like that you may use the Copy Database -> Schema option in SQL Developer. However, to avoid any data consistency issues, you have to make sure the tables in the schema being copied are not used/modified during the copy process. Also, after the copy is over, check the copy job log for any issues.
@moedweik9523
11 months ago
@@YouVolve The only problem is that the application has 10-20 schemas. Would it be cumbersome to use schema copy in this case? Can I do expdp/impdp, or would that require generating and then moving the export file from the source RDS to the destination RDS data pump directory?