If you’re working on a Mac, you may have run into the problem of trying to use SSH with a .ppk (PuTTY) key file. You can easily convert this file to a Mac-friendly .pem file.
- Install Homebrew
- Install putty, which also installs puttygen:
brew install putty
- Generate .pem file:
puttygen keyfile.ppk -O private-openssh -o keyfile.pem
- Change permissions:
chmod go-rw keyfile.pem
- Connect using the new key file:
ssh -i keyfile.pem username@server
In a previous post I explained how to set up a SQL Developer connection with SSH using a .pem key file. In this post I’ll show you how to generate the .pem file and what you have to do on your remote AWS instance in order to get this to work.
In my case, I use an EC2 instance on which my Oracle database is installed. Via the EC2 online dashboard, I created a Key Pair:
After clicking Create, the private key file sqldev-key.pem was downloaded. Note that the key file is only available during the creation process; you will not be able to download or re-create it later.
I then placed the downloaded file in a directory on my laptop. Using Terminal, I generated the public key using ssh-keygen -y. When asked for a file, I gave it the sqldev-key.pem file name. The command then output the public key string, which I copied to my clipboard.
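For reference, that step can be reproduced non-interactively with the -f option; the demo key below is a throwaway stand-in for sqldev-key.pem:

```shell
# Derive a public key from a private key with ssh-keygen -y.
# demo-key is a throwaway stand-in for the downloaded sqldev-key.pem.
ssh-keygen -t rsa -b 2048 -N "" -f demo-key -q   # create a throwaway key pair
ssh-keygen -y -f demo-key                        # prints the public key line
```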
In order to limit access to the instance, I created a user account with no privileges on my database host, through the command line:
$ useradd sqldev
$ passwd sqldev
I then logged in as sqldev and created the file ~/.ssh/authorized_keys, into which I pasted the key from my clipboard. I then changed the permissions of the authorized_keys file:
$ chmod 400 authorized_keys
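The server-side steps above can be sketched as a few commands (the key line is a placeholder for the real public key from your clipboard; note that sshd also ignores the key if ~/.ssh itself is too open):

```shell
# Run as the sqldev user on the database host.
mkdir -p ~/.ssh
chmod 700 ~/.ssh                      # sshd requires ~/.ssh not be group/world writable
echo "ssh-rsa AAAA... sqldev-key" >> ~/.ssh/authorized_keys   # paste the real public key here
chmod 400 ~/.ssh/authorized_keys
```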
Now I tested the connection from my laptop:
$ ssh -i sqldev-key.pem sqldev@[aws host ip]
Now that the connection is verified, I set up SQL Developer to use the new key file for my database connection. See SQL Developer SSH Connection.
Interactive reports can be enabled to be accessed through REST. Doing so is pretty easy if you know where to look.
- Make sure the page containing the report is public.
- Give the report a static ID.
- Switch the page designer to Component View.
- In the Attributes section set Enable RESTful Access to Yes.
- The access URL will be provided for you. Copy and paste it into a browser to see the report results in JSON format.
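Once enabled, the report can also be pulled from the command line; the URL below is a placeholder for the access URL APEX displays:

```shell
# Fetch the interactive report as JSON (replace the URL with the one APEX shows you).
curl -s "http://yourhost:port/your-report-access-url"
```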
This is just a quick & dirty re-blog of a handy article I found.
Here is a great post by Dimitri regarding HTTPS:
I recently had the need to display a color spectrum, depending on a column value, in a report. The colors had to go from red (column value zero) to green (column value 100).
To achieve this I added a formula to my query that would generate an RGB value based on the column value. I then used this value to render a circle of that color, with the column value inside of it.
Here is the sample query for the report. The report has two columns:
VALUE: The value I wish to display.
VALUE_COLOR: The generated RGB color that gets used in the VALUE’s HTML Expression.
select level as value
     , 'rgb(' || round((255 * (100 - level) / 100), 0)
       || ',' || round((255 * level) / 100, 0)
       || ',0)' as value_color
  from dual
connect by level <= 100
 order by 1
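The ramp the query produces can be sanity-checked outside the database with a quick shell loop (shell integer arithmetic truncates where round() rounds, so midpoints may differ by one):

```shell
# Reproduce the red-to-green ramp: value 0 -> rgb(255,0,0), value 100 -> rgb(0,255,0).
for v in 0 25 50 75 100; do
  r=$(( 255 * (100 - v) / 100 ))
  g=$(( 255 * v / 100 ))
  echo "value=$v -> rgb($r,$g,0)"
done
```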
I used some custom HTML in the Value column’s HTML Expression to render a Font Awesome circle in the color of the generated RGB value, with the column value inside.
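The exact markup isn’t shown in the post; here is one possible HTML Expression for the VALUE column, assuming Font Awesome’s stacked-icon classes are loaded (#VALUE# and #VALUE_COLOR# are the standard APEX column substitutions):

```html
<span class="fa-stack" style="color:#VALUE_COLOR#;">
  <i class="fa fa-circle fa-stack-2x"></i>
  <strong class="fa-stack-1x" style="color:#ffffff;">#VALUE#</strong>
</span>
```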
That’s it. Have fun with it.
If you have to access databases that can only be reached through SSH connections, you can configure SQL Developer (4.1) to reach them.
In my example, a .pem key file is required to establish the SSH connection.
- In SQL Developer, select SSH from the View menu. This opens the SSH Hosts dock.
- Right click the SSH Hosts node in the dock and select New SSH Host from the context menu.
- Fill out the New SSH Host form
- Name: The name of your connection
- Host: Remote host address
- Port: Should default to 22
- Username: The username of the SSH connection
- Use key file: Browse to the .pem key file needed for this connection
- Add a Local Port Forward: Check this to forward a local port to the remote database.
- Name: A name for this port forward
- Host: The same as above
- Port: Database port of the remote host
- Automatically assign local port (selected) or you can choose to use a specific local port if you wish
- Click OK.
A node with your new connection should now appear in the SSH Hosts connections.
Right click the node and select Test from the context menu. A success message should pop up.
Create a new database connection:
- Fill in the connection name, username and password for the remote database.
- For Connection Type choose SSH.
- Select the newly created Port Forward from the select list.
- Click Service Name and fill in the service name of the remote database.
- Test and Save the connection.
That’s it. You should now be able to connect via the SSH Tunnel.
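For the curious, the tunnel SQL Developer builds is roughly equivalent to this plain ssh command (key file, user, host, and ports are the placeholders from this post):

```shell
# Forward local port 1521 to the database port on the remote host.
# -N: no remote command, just the tunnel; leave this running in its own terminal.
ssh -i sqldev-key.pem -N -L 1521:localhost:1521 sqldev@your-aws-host
```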
Anton Nielsen provided a great tip for troubleshooting apex_web_service in his post apex_web_service.make_rest_request not working with POST.
I’ve been struggling to get a RESTful POST request working through APEX. After some time of trial and error I finally got a working example going.
For this test I created a table to store the REST data:
create table rest_data (description varchar2(100), blob_value blob);
From the SQL Workshop menu, select RESTful Services. Click the Create button to create a new service, and configure it by simply giving it a name (cmr) and a URI Template (test/) (don’t forget the trailing slash).
Set the Resource Handler method to POST and Source Type to PL/SQL. Now the tricky part is about accessing the header variables and the payload of the POST request.
Header variables come across as bind variables and can thus easily be used in PL/SQL. The payload comes across as a BLOB and is a little sensitive to how it is handled.
The PL/SQL source then simply inserts into the table.
declare
  l_blob blob := :body;
begin
  insert into rest_data (description, blob_value)
  values (:description, l_blob);
end;
Note that the :body bind variable is assigned in the declaration section of the block. There is currently some odd behavior: if you assign it any other way, the BLOB length is zero.
After clicking Create Module, navigate to and edit the new POST handler. I set Requires Secure Access to No, so I don’t have to mess with HTTPS.
Now add a parameter for the header variable. I added one for description, by clicking the Create Parameter button. Note that bind variable names are case sensitive.
That’s it. To test the service you can use a browser plugin like Postman.
The URL to test will be something like this: http://yourhost:port/ords/workspace/test/
The body of the request can be any sample JSON string.
Sending this POST request should insert a row into the rest_data table with the description and the JSON BLOB.
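Instead of a browser plugin, the handler can also be exercised with curl; host, port, and workspace below are placeholders, and the description header feeds the :description bind variable:

```shell
# POST a sample JSON body to the new handler.
curl -X POST \
     -H "Content-Type: application/json" \
     -H "description: first rest post" \
     -d '{"hello":"rest"}' \
     "http://yourhost:port/ords/workspace/test/"
```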
Eventually you’ll want to convert the JSON BLOB into a CLOB so you can process it with APEX_JSON. You can use wwv_flow_utilities.blob_to_clob() to easily make that conversion:
declare
  l_blob blob;
  l_clob clob;
begin
  select blob_value into l_blob from rest_data;
  l_clob := wwv_flow_utilities.blob_to_clob(l_blob);
  apex_json.parse(l_clob);
end;
In order to formulate a response, you need to create an OUT parameter that either returns a header variable, or the response body.
Then add PL/SQL to the source to populate the response bind variable:
:response := 'Receipt OK.';
You can also print directly to the response body by calling htp.p in the handler source, e.g. htp.p('Receipt OK.').
Amazon’s AWS Database Migration Service (DMS) is now in preview. This service allows you to easily migrate your on-premise database to the Amazon cloud.
This is done through a declarative interface, where you specify your source and target endpoints, and either a VPC (virtual private cloud) or VPN connection. The service will then pull your database/schema/tables and instantiate them with data (data filters are optional) on your cloud instance. The setup will only take a few minutes and DMS will handle the rest.
DMS supports a variety of platforms and can even perform heterogeneous migrations, such as from Oracle to MySQL.
Once a migration is complete, DMS will even handle all the data changes that occurred during the migration and apply them to the target. This makes moving even production databases easier.
Among others, DMS supports Oracle Enterprise, Standard, and Standard One for versions 10g, 11g, and 12c, for both the source and target.
Rather than using the web interface, DMS can also be used through a command line interface.
If you choose to do a heterogeneous migration, Amazon provides the AWS Schema Conversion Tool, a downloadable tool that helps convert your database from one platform to another.
This is quite an amazing feat by Amazon. Check it out!