Wednesday, December 16, 2015

Creating a Linux Instance on Amazon AWS

After creating an Oracle database on AWS I found there wasn't as much flexibility as having a Linux host - no SQL*Plus, no cron, and I couldn't get APEX ORDS to work.

So I thought I'd create a Linux instance and do it that way. One of our clients is doing this, and when we connect it's just a putty connection to a Linux host - you wouldn't know it was in AWS.

Before you go too far, download 'puttygen.exe' (it's distributed from the same page as PuTTY itself) - you'll need it later.

So, log into the AWS console.

Select 'EC2' at the top of the page.

At the next page, click on 'Launch Instance'

Click on 'Select' against the Red Hat Instance

Make sure the 'General Purpose' one is checked - it's the free-tier eligible one

Click on 'Review and Launch'. If you select 'Next: Configure Instance Details' it goes through options, all of which will cost money, so you probably don't want to.

Otherwise, click 'Launch'

You need to create a key pair - I tried the option to log in with a username/password but had problems, so use the drop-down to select 'Create a new key pair'

Give it a name, then click on 'Download Key Pair'

Save it, and remember where you put it - it's pretty important.

Click on 'Launch Instances'

A page will show the instance being created

Click on the 'View Instances' button

It should be running

You will see a 'Public DNS' - this is the address you use with Putty. However, the PEM file generated from the 'Create a new key pair' doesn't work with Putty, so that's why you need Puttygen.

Start puttygen.exe

Make sure 'SSH-2 RSA' is selected at the bottom

Click on 'Load'

Use the drop-down to select 'All Files'

Locate the pem file you generated and click 'Open'

A notice will open saying the key was successfully imported - click 'OK'

Then click on 'Save private key'.

Click 'Yes' when it asks whether you want to save the key without a passphrase.

Save the file with a 'ppk' extension, give it the same name as the PEM file, and click 'Save'

You can now close the puttygen window.
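Incidentally, if you are working from a Linux machine, the puttygen command-line tool (in the 'putty-tools' package on most distributions) can do the same conversion without the GUI - a sketch, with a placeholder key name:

```
# Convert the AWS-generated PEM key to PuTTY's ppk format
# ('mykey' is just an example name)
puttygen mykey.pem -O private -o mykey.ppk
```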

Open Putty, and create a new session.

In the host name field, enter ec2-user@xxxxxx.xxxxx.xxxxx.xxxx

where the xxxx's are the Public DNS from the AWS console

Expand 'SSH' in the left-hand window, then click on 'Auth'

Browse to the ppk file you generated with Puttygen

And click 'Open'

Click on 'Session' and then on 'Save', giving it a name

Then click on 'Open'.

Click 'Yes' at the security alert screen

You should be logged straight in - it won't prompt for a password
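As an aside, if you connect from a Linux or Mac box instead of Windows, you can skip PuTTY and the ppk entirely - OpenSSH accepts the PEM file directly. A sketch, with the same placeholder host and an example key name:

```
# The key file must not be world-readable or ssh will refuse to use it
chmod 400 mykey.pem
ssh -i mykey.pem ec2-user@xxxxxx.xxxxx.xxxxx.xxxx
```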

Note that to do anything as root you need to use 'sudo'
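For example, to patch the instance or take a root shell:

```
# One-off command as root
sudo yum -y update

# Or switch to a root shell
sudo su -
```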

So that’s it, you now have a Red Hat Linux host. I was going to do another document on how to install Oracle, but it’s the same as installing Oracle on any Linux host so just scp the installation files to the host and follow the Oracle documentation.
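That copy step might look something like this - the zip name is just an example, and note that scp needs the same key as the ssh login:

```
# Push the Oracle installation media up to the instance
scp -i mykey.pem oracle_database_installer.zip ec2-user@xxxxxx.xxxxx.xxxxx.xxxx:/tmp/
```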

Monday, November 23, 2015

Oracle on Amazon – Importing data into the Cloud Database

So I had a think about how to do this. I’ve set up a SQLDeveloper connection so thought I could use that, but that would only work for small and simple data transfers because it would probably be slow. 

Obviously the best way would be to use data pump, but I don't have access to the host because the database is in the cloud - I didn't set up a host and install Oracle (maybe another time).

However, there is a data_pump_dir defined in the database: 

SYS   DATA_PUMP_DIR    /rdsdbdata/datapump 

So there must be a way to use it, and surely Amazon have some information on how to migrate data to the cloud. 


Amazon's documentation suggests SQLDeveloper for small stuff, but also shows how to use data pump.

I pre-created the tablespace using SQL Developer – since I assumed the database would be using OMF, I just ran ‘create tablespace replicon;’ and let the database do all the hard work. 
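A sketch of that - the 'show parameter' check is just to confirm OMF is actually in play before relying on it:

```
-- If db_create_file_dest is set, OMF will name and place the datafile
show parameter db_create_file_dest

create tablespace replicon;
```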

Next, grab an export file from the source database - the Amazon document shows how to take an export using DBMS_DATAPUMP, or you can do it the old-fashioned way from the command line.
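The old-fashioned command-line version would be something like this, run on the source host (the connect string and log file name are just examples):

```
expdp system@sourcedb schemas=REPLICON dumpfile=replicon.dmp directory=DATA_PUMP_DIR logfile=replicon_exp.log
```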

Once you have the dump file, you create a database link from the source database to your Amazon cloud database: 

SQL> create database link andy_amazon connect to dbasupport identified by password using 'ANDY_AMAZON';

Database link created.

SQL> select table_name from user_tables@andy_amazon;  

no rows selected   

Next, the document says you need to get the file to the Amazon DATA_PUMP_DIR using DBMS_FILE_TRANSFER, then import using impdp. The transfer call (saved here as trans_file.sql) pushes the dump file over the database link:

BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'replicon.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'replicon.dmp',
    destination_database         => 'ANDY_AMAZON');
END;
/

SQL> @trans_file

PL/SQL procedure successfully completed.
The documentation shows the impdp command - but doesn't explain that you can't run commands because you don't have host access, you just have a database. Further searching revealed that you have to create an Amazon EC2 instance, install the Oracle client and then use the impdp binary from that.
That's a bit of a pain to be honest; the easier way is just to run the import from the host you took the export on - although you should weigh the time it would take to build an Amazon EC2 host against the time to copy the dump file over the link. A 13MB file took about 3 minutes to transfer.

Create a tnsnames.ora entry for the Amazon database on your database host: 

ANDY_AMAZON =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = <amazon-endpoint>)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = <db-name>)
    )
  )

Test that you can connect: 

[oracle@mgalinux admin]$ sqlplus dbasupport@ANDY_AMAZON  

SQL*Plus: Release Production on Tue Nov 24 07:09:48 2015  
Copyright (c) 1982, 2013, Oracle.  All rights reserved.  

Enter password: 

select host_name from v$instance;  


Pre-create the user and grant any permissions on tablespaces as required. 
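Something along these lines - the password and grants are only illustrative, so adjust to match the source schema:

```
create user replicon identified by "SomePassword1" default tablespace replicon;
grant create session, create table to replicon;
alter user replicon quota unlimited on replicon;
```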

Now you should be able to use the data pump import into the Amazon database using the tnsnames entry, assuming you've copied the file there. 

impdp dbasupport@andy_amazon DUMPFILE=replicon.dmp DIRECTORY=DATA_PUMP_DIR full=y 

Import: Release - Production on Tue Nov 24 07:29:09 2015

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 11g Release - 64bit Production
Master table "DBASUPPORT"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "DBASUPPORT"."SYS_IMPORT_FULL_01":  dbasupport/********@andy_amazon DUMPFILE=replicon.dmp DIRECTORY=DATA_PUMP_DIR full=y
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"REPLICON" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "REPLICON"."REPLICON"                       5.133 MB   34534 rows
. . imported "REPLICON"."REPLICON_BACKUP"                5.121 MB   34438 rows
. . imported "REPLICON"."LOGO"                           559.1 KB       1 rows
. . imported "REPLICON"."CONTACT_DETAILS"                6.882 KB      22 rows
. . imported "REPLICON"."PROJECTS"                       14.21 KB     100 rows
. . imported "REPLICON"."REPLICON_YEARS"                 5.476 KB       5 rows
Job "DBASUPPORT"."SYS_IMPORT_FULL_01" completed at Mon Nov 23 20:38:05 2015 elapsed 0 00:00:09
Log into SQLDeveloper and check the tables are there.

So that was actually quite easy - obviously large amounts of data may be a challenge, but the process works.

Next time - installing APEX and running an APEX application from the cloud.