You are able to request an access token from OCI IAM. Yet, when you issue the subsequent request to your target resource (an ORDS endpoint), you receive the following message (error="invalid_token"):
WWW-Authenticate: Bearer realm="Oracle REST Data Services", error="invalid_token"
Actions you’ve taken
You’ve done the following in OCI:
Registered an Integrated Application with Oracle Identity and Access Management (IAM)
Created a Primary Audience & Scope
Obtained your Client ID and Client Secret
Configured your networking correctly (or at least have high confidence it’s configured correctly)
Created the JWT Role and Privilege (which should be the same as the OCI Scope name)
And protected your target resource (aka ORDS API)
You’ve placed everything where it should be in your choice of API testing tool (cURL, Postman, Insomnia, etc.).
YET…you still receive this error="invalid_token" message. It is quite possible that you have not made the JWK URL publicly accessible in OCI IAM.
Solution
Here is how you can verify and resolve this issue. First, navigate to your domain, then select Settings.
If this Configure client access box is unchecked, it is likely the culprit. Check it, then select Save Changes (the button at the bottom of the screen).
This box must be checked so your application can automatically access a JWK URL (to be used for decoding a JWT) without having to sign in to the OCI tenancy.
Then, once you re-attempt your HTTP request, ORDS will be able to:
Access the JWK URL (which you’ve included when you created your JWT Profile)
Verify the authenticity of your JWT, and
Respond with the results from your resource (ORDS endpoint)
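If you want to confirm the fix from outside OCI, you can check whether the JWK URL answers anonymously. Here is a minimal sketch; the Domain URL below is a made-up placeholder (substitute your own from the OCI Console), and the `/admin/v1/SigningCert/jwk` path is the JWK endpoint mentioned later in this post:

```python
import json
import urllib.error
import urllib.request

# Hypothetical Domain URL -- substitute your own from the OCI Console.
DOMAIN_URL = "https://idcs-abc123.identity.oraclecloud.com"

def jwk_url(domain_url: str) -> str:
    """Build the JWK endpoint that ORDS must be able to reach anonymously."""
    return domain_url.rstrip("/") + "/admin/v1/SigningCert/jwk"

def jwks_is_public(domain_url: str) -> bool:
    """Return True if the JWK URL responds without any authentication."""
    try:
        with urllib.request.urlopen(jwk_url(domain_url)) as resp:
            return resp.status == 200 and "keys" in json.load(resp)
    except urllib.error.HTTPError:
        # A 401/403 here usually means "Configure client access" is unchecked.
        return False
```

If `jwks_is_public()` returns `False` while your browser (signed in to OCI) can open the same URL just fine, that checkbox is almost certainly the culprit.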
Et voilà! You’re back in business!
To-do list
I think we have some action items, too:
Investigate this error message and see if we can improve the message to the user (we’ve already filed an enhancement request on this)
Update the docs to be more specific on this setting and where to find it (a documentation bug has already been filed for this)
Determine if this is a good candidate for adding to the troubleshooting section of our guide
You have an Oracle Cloud Infrastructure (OCI) Domain, or intend to set up a Domain in OCI, so that you may register an "Integrated Application" therein and use the JSON Web Tokens (JWTs) provided by OCI Identity and Access Management (IAM) to request access to protected ORDS Resources (aka ORDS APIs).
If this describes you, then keep reading!
JWTs with Entra? If you are looking for how to configure ORDS to be used with Microsoft Entra JWTs, then you'll want this tutorial.
Creating a Domain
NOTE: Skip this section if you already have a Domain or know how to set one up!
Assuming you are at the central OCI Console, click the navigation icon at the top of your screen. Then, navigate to Identity & Security and click Create domain.
Make sure you are in the correct Compartment!
NOTE: Make sure you are in your correct Compartment (this will be up to you to decide)! Everything I show here is done within the same /ords Compartment.
In this example, I’ve chosen to create an “External User” Domain type. I haven’t tested with any other Domain types, but I assume they all work similarly.
Double check. Are you in the correct Compartment?
Once you’ve entered all the information (e.g., Domain administrator values and Compartment), click Create Domain. Then, click on the Domain you just created. Next, you’ll need to create and add an “Integrated application.”
Mini-tutorial: Adding a user to your Domain
In this mini-tutorial, I demonstrate how to create a new user. Notice that I’m using a personal email account; you can totally do this!
Once you’ve added this user, they’ll receive a password reset email for this Domain. They must create a new password before logging in.
NOTE: I'll be logging in as this user for the JWT demo.
1. Navigate to your Domain.
2. Click Create user.
3. Enter your details, and click Create.
Configure client access
NOTE: This box must be checked so your application can automatically access a JWK URL (to be used for decoding a JWT) without having to sign in to the OCI tenancy.
First, navigate to your domain, then select Settings.
If this Configure client access box is unchecked, check it, then select Save Changes (the button at the bottom of the screen).
That’s it. You’re done!
Adding an Integrated Application to the Domain
Adding the application
From within your Domain dashboard, click the Integrated Applications menu item, then click Add application. The “Add application” modal will appear. Read through the application types and choose the most appropriate one. For this demo, I selected “Confidential Application.”
Continue filling in the modal fields. Make sure you choose a unique name!
I’ve left the URLs section empty for this demo and kept the Display settings checked. The Display settings are options for users when logging into their Domain (i.e., they don’t impact this workflow, but they are nice to have set for later).
PRO-TIP: Even though the Description is optional, I recommend including at least a brief sentence. Trust me, your future self will thank you later! Oh, and this same advice applies to your ORDS Resource Modules too!
Click Next to continue to the Resource server configuration section.
Configuring OAuth2.0
In this section, you’ll configure your Resource server and Client. First, we’ll create a Scope1.
Resource server configuration
When you configure your Scope, you must include a Primary audience field. Here, I’m just choosing the Domain’s name. The Primary audience needs to be structured exactly like this (with the trailing slash)! Next, create a Scope (I’m using dot notation, since this will mirror the privilege I’ll create in ORDS, later).
Your fields should look like the above.
When finished, click Add.
Client Configuration
Next, you’ll include details about your client. In this demo, I use Postman as a client application stand-in. For this example I’m using the Authorization code grant type2, along with a Redirect URL.
FYI: Postman has a /callback URL that you can use as a Redirect URL.
Mini-tutorial: How do I find Postman’s Callback URL?
From the Authorization tab, select OAuth 2.0. Then scroll down to the Configure New Token section. There you will find the Callback URL. Copy it. You’ll need it!
INFO: The "Redirect URL" is simply the location that OCI IAM should send you back to, once you have authenticated/retrieved a JWT. If this doesn't make sense, don't worry, you'll see that in practice in a later section.
I’ve chosen “All” for the Token issuance policy. Next, click Finish.
TIP: Click the (?) and you'll see a pop-up with information that can help you decide which option is best for you. For demonstration purposes, "All" is the most generous option.
Issuing a POST request to obtain a JWT
With your Domain set up, you can send off that initial POST request to obtain the JWT from IAM. You’ll then use the details contained in that JWT to configure ORDS.
Gathering the required values for the request
If your application hasn’t already been activated, do so now.
Next, click the Edit OAuth configuration button and retrieve your Client ID and Client secret. Save them to your clipboard or your environment files (if using this for an application).
You will also need the Domain URL. Navigate to your Domain’s main dashboard (❗️make sure you are still in the correct Compartment). Then copy the Domain URL to your clipboard, or environment file (aka your .env file).
Setting up Postman
In Postman, make sure you have selected “Request Headers” in the Add authorization data to field. Also, add the word “Bearer” to the Header Prefix field.
You’ll want to select Authorization Code as the Grant Type. For the Authorization URL and Access Token URL, you will use your Domain URL followed by:
Next, add your Client ID, Client Secret, and Scope. Notice how the Scope in Postman uses the Primary audience and the Scope (if you don’t remember where these came from, review the Resource server configuration section). Select “Send as Basic Auth header” for the Client Authentication field.
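Under the hood, Postman assembles a fairly ordinary OAuth 2.0 token request from those fields. A rough sketch of the equivalent pieces, using made-up credentials and a placeholder audience (everything here is illustrative, not your real values):

```python
import base64
import urllib.parse

# Hypothetical values -- use your own Client ID, Client Secret,
# Primary audience, and Scope from the OCI Console.
CLIENT_ID = "abc123"
CLIENT_SECRET = "s3cret"
PRIMARY_AUDIENCE = "https://mydomain/"  # note the trailing slash!
SCOPE = "ords.demo"

# "Send as Basic Auth header" means base64("client_id:client_secret").
basic = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
headers = {
    "Authorization": f"Basic {basic}",
    "Content-Type": "application/x-www-form-urlencoded",
}

# The scope sent to the token endpoint is Primary audience + Scope.
body = urllib.parse.urlencode({
    "grant_type": "authorization_code",
    "code": "<authorization code returned to your redirect URL>",
    "scope": PRIMARY_AUDIENCE + SCOPE,
})
```

Postman fills in the authorization code for you after the browser redirect; the point here is simply what the header and form body are made of.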
Requesting the JWT
This next part is easy. From Postman, scroll down to the bottom of the Authorization window until you see the Get New Access Token button. Click it; a new browser window will appear. Enter the user credentials (I’m using the one I created specifically for this demo), and if any pop-up blocker notifications appear, make sure you “allow pop-ups.”
Once authenticated with IAM, you’ll be redirected back to Postman (remember, this is the Redirect URL you added in the Client Configuration section).
Copy the entire JWT, and then click Use Token. You’ll now see the token in the Current Token field in Postman.
HELP ME! Okay, if the Sign-in and/or Redirect didn't work for you, it might be that you are still signed in to another OCI session with a different user (a user that doesn't belong to this Domain). What I've had to do in the past is make sure I'm logged out of all OCI sessions, clear my cookies, close my browser, and then restart with a new one. This has nothing to do with ORDS, but it's what worked for me.
ORDS configuration
In these following sections, you will configure your ORDS JWT Profile and ORDS Privilege (associated with the Scope you’ve created). But first, you’ll need to decode your JWT.
Decoding the JWT
In the past, I’ve used two different tools for decoding these JWTs: JWT.io and JWT.ms. In this demonstration I’m using JWT.io.
Paste your JWT into one of the decoders and copy the following values to your clipboard:
iss
scope
aud
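If you’d rather not paste a live token into a third-party site, you can decode the payload locally; a JWT’s payload is just base64url-encoded JSON. A small sketch (the toy token below is fabricated purely for illustration, and no signature verification happens here):

```python
import base64
import json

def decode_jwt_payload(jwt: str) -> dict:
    """Decode the middle (payload) segment of a JWT -- no signature check!"""
    payload = jwt.split(".")[1]
    # Restore the base64url padding that JWT encoding strips off.
    payload += "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(payload))

# A fabricated header.payload.signature token, just for illustration:
toy_jwt = ".".join([
    base64.urlsafe_b64encode(b'{"alg":"none"}').decode().rstrip("="),
    base64.urlsafe_b64encode(
        b'{"iss":"https://identity.oraclecloud.com",'
        b'"aud":"https://mydomain/","scope":"ords.demo"}'
    ).decode().rstrip("="),
    "fake-signature",
])
claims = decode_jwt_payload(toy_jwt)
```

The `claims["iss"]`, `claims["aud"]`, and `claims["scope"]` values are the three you’ll carry over into the ORDS configuration.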
The values you’ll need: Issuer, Scope, and Audience.
My ADB setup
So as not to make too many assumptions, let me show you my ADB setup. I’ve created a 23ai ADB, in the same Compartment as my Domain. And I’ve created an “ORDSDEMO” user. So, everything I’m doing, in ORDS, from here on out is using that user.
You don’t need any other users up until this point. In fact, you could just use the ADMIN account to do all of this. But I’m choosing to use the ORDSDEMO user to demonstrate how to do this with a non-ADMIN, or non-DBA user. So, when we get to the ORDS PL/SQL procedures (found in the ORDS_METADATA schema), you’ll notice they are the non-ADMIN versions of the packages.
Creating the ORDS JWT profile
Step one: Create your JWT Profile. This process “registers” a JWT profile with ORDS, so when an incoming request comes in, ORDS can “cross-reference” information related to Issuers, Audiences, and Scopes.
The easiest way to do all this is to first navigate to an SQL Worksheet and then select the schema from the Navigator tab. Then, under the database objects drop-down menu, choose Packages.
Click the OAUTH package to expand it, scroll to the bottom of that list, right click the CREATE_JWT_PROFILE PL/SQL procedure, and click Run. A slider will appear.
⚠️ REMINDER: The old OAUTH PL/SQL packages are deprecated and will be officially de-supported October 2025. We still have the old PL/SQL Security packages in our docs, but the new ones are included too. Link.
Enter your details exactly like you see below:
P_ISSUER is https://identity.oraclecloud.com
P_AUDIENCE should be whatever your Primary audience is, with the trailing slash included!
P_JWK_URL
FYI: The P_JWK_URL is a combination of [Your Domain URL] + /admin/v1/SigningCert/jwk (this is the endpoint identified in the OCI IAM Domains API doc). And since this is a /v1 endpoint, I assume there may be more. So, double-check the docs in case of a different/updated version.
❗️STOP: You need to either comment out or completely remove the P_ALLOWED_SKEW and P_ALLOWED_AGE values (in both blocks), along with the trailing comma of the L_P_DESCRIPTION value. For demonstration purposes, those two values aren't required. And leaving them as "0" will make it impossible to obtain a JWT. They can be "NULL", but not "0"; those are two totally different values!
Then, execute the PL/SQL procedure.
Next, you’ll create an ORDS Privilege (to associate with our protected ORDS Resources).
Creating an ORDS privilege
Aka, the “thing” we are protecting and making accessible with a valid JWT. Navigate to your REST Workshop. From the Security tab, select Privileges.
Create a new Privilege and name it exactly the same as the Scope you created in IAM. DO NOT INCLUDE the Primary audience in this scope name (In fact, you can’t; we won’t let you anyway 🤣)! In the Protected Modules tab, select the Module you want to protect. Either drag it from the Available Modules tab to the Selected Modules tab or use the arrows to move it over. Then click Create.
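To make the relationship concrete: the scope you request combines the Primary audience and the Scope name, but the ORDS privilege uses the name alone. A small hypothetical illustration (both strings are made up):

```python
# Hypothetical values: the scope you request from IAM combines the
# Primary audience and the Scope name you created.
primary_audience = "https://mydomain/"
full_scope = "https://mydomain/ords.demo"

# The ORDS privilege name is just the trailing scope name --
# never the audience-prefixed form.
privilege_name = full_scope[len(primary_audience):]
```

So in this example, the privilege you create in the REST Workshop would be named `ords.demo`, nothing more.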
A peek at the ORDS API I’m using:
This just prints out the current user, using the ORDS :current_user implicit parameter.
Accessing your protected Resource (aka ORDS API) with the JWT
Now, you may access this protected resource. Unless you’ve taken an hour to set ORDS up, your token will still be valid (it’s valid for an hour). Otherwise, you’ll need to obtain a new, fresh JWT. Using the target ORDS API (mine is the “Hello :current_user” endpoint) in your Postman request field, click Send. You’ll see two things happen (if you are viewing your Postman Console):
An initial POST request to the Token server, and
A subsequent GET request to your target ORDS API
I should clarify…you might not see that initial POST request because you would have accomplished this before you opened the Postman Console. So, if you’d like to see these requests in real time, do this:
Clear out your cookies
Delete your existing token
Clear out the Console
Open the Console back up
Request a new JWT
Request your ORDS API (your protected Resource)
The End
And that’s it, finished! If you’ve made it this far, congratulations. That was a lot of work. Time for a coffee break!
Follow
And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!
There are four different Grant Types in the OAuth 2.0 framework: Authorization Code, Implicit, Resource Owner Password Credentials, and Client Credentials. You can read more about them here. ↩︎
ALERT: This is going to seem extremely out of context! But this post actually loosely relates to the ORDS Pre-hook functions section of our docs. I'm in the process of (1) working on a presentation and (2) updating this section of the docs as well (productivity trifecta for the win!), hence why we are here.
Hypothetical scenario
Hypothetically speaking, let’s say you were interested in learning more about Common Gateway Interface (CGI) Environment variables1, what they are, and how to use ORDS to REST-enable a function to produce these variables. If that is the case, you are in luck, my friend!
What follows is a quick way for you to learn more about these variables (as they relate to the Oracle database) and use ORDS in the process!
An excerpt from another “work in progress”
For this example, we’ll rely on the OWA_UTIL PL/SQL package, specifically the PRINT_CGI_ENV procedure (an HTML utility; one of three utility subprograms in the OWA_UTIL package). First, create a Resource Module and Template. Then, when creating a Handler, choose plsql/block as the Source Type and use the PRINT_CGI_ENV procedure in the Handler code.
Like this:
begin
  OWA_UTIL.PRINT_CGI_ENV;
end;
I created this Resource Module on my “localhost”; your URI will differ in an Autonomous Database – Always Free account (sign up here). Remember, you can export PL/SQL definitions in case you want to reproduce the Resource Module.
From there, either copy and paste this Handler’s URI (in the above example, that is https://localhost:8443/ords/ordstest/v1/api) into a new terminal session (if using a tool like a curl), or Postman (or a similar testing tool), or navigate to the URI in a new browser tab or window. You’ll see all the CGI Environment variables that are sent back (in an unauthenticated server response) to you, a client, or an application. Pretty neat trick, eh?
Here is an example of the response from an Autonomous Database – Always Free tenancy:
Here is a curl command response from a development configuration (i.e., A locally installed ORDS instance running in Standalone mode and a 23ai database in a Podman container).
A note about the -k (--insecure) option in this curl command (HINT: it circumvents TLS verification for development purposes).
As you can see, there is tons of data to work with; something to remember if you want to use CGI Environment variables with your ORDS pre-hook (YOU DO NOT HAVE TO; I’m just showing you an example of one of the countless possibilities!).
Start small
You might want to start small by implementing a security policy using something as simple as the QUERY_STRING variable (e.g., where perhaps your ORDS prehook function calls upon an underlying function or procedure that uses a query string as a parameter). Our pre-hook example does something like this, actually 😀.
Check this out; look what happens when I append ?chris to the end of this URI:
And like magic, the QUERY_STRING CGI Environment variable now has a value assigned to it! See how simple and automatic this is?
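The same split is easy to play with outside the database. A quick sketch of how the query-string portion of a URI maps onto QUERY_STRING (the URI is the hypothetical localhost endpoint from earlier):

```python
from urllib.parse import urlsplit

# Hypothetical ORDS endpoint with a bare query string appended.
uri = "https://localhost:8443/ords/ordstest/v1/api?chris"

# Everything after the "?" is exactly what lands in the QUERY_STRING
# CGI environment variable that a pre-hook function could inspect.
query_string = urlsplit(uri).query
```

Here, `query_string` comes back as `chris` — the same value PRINT_CGI_ENV shows for QUERY_STRING.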
Something to think about: even if you don’t care about CGI Environment variables today, I guarantee this will be useful in the future. I bet you’ve been in a position where at least some of this is relevant to you on any given week. So, if nothing else, maybe REST-enable this PRINT_CGI_ENV procedure, so you have it ready whenever you need it!
The end
That’s all for now, folks. This is a quick post that hopefully will come in handy one day 😎. Until next time, keep calm and query on.
Follow
And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!
There’s plenty to talk about in this release. However, I’m most excited about the performance improvements, ORDS sample applications, and documentation changes. Read on to get the whole story.
Enhancements
API Performance
REST API responses from either AutoREST or custom-built modules could see as much as a 30% improvement in response times.
About a year ago, we introduced (we owe all our progress to Orla though) an internal program to track performance changes/improvements across ORDS APIs quantitatively. I can’t go into too much detail, but here is what I can divulge:
Although we use k6 for our performance testing, we are not promoting its use over any other available performance-testing solution. There are other great tools available (e.g., Artillery and JMeter).
Testing is performed nightly against a 23ai database (installed in a PDB); we also include APEX in these tests.
For the tests, 250 schemas are created and then populated with various database objects (e.g., Functions, Materialized Views, PLSQL Packages, Sequences, Tables, Triggers, JSON Relational Duality Views, etc.)
These schemas are then absolutely hammered with Virtual Users. Users perform actions such as auto-REST enabling objects, creating custom Resource Modules, creating JSON Relational Duality Views, interrogating ORDS Metadata, and performing bulk inserts (BATCHLOAD), GETs, POSTs, etc.
These metrics are what we use to track the ORDS quantitative metrics longitudinally.
So, that’s what we mean by “performance improvements.” Pretty cool, eh?
NOTE: I don't know whether that 30% average is a mean or a median. So, for all you stat nerds, don't ask 🤣!
ORDS Sample applications
We have not one but TWO sample ORDS applications for you 😍!
Flask/Python
The first is a fully contained LiveLabs sandbox workshop, which can be found here. But if you want to remix the code, check out my repo here (everything is heavily commented; hopefully, this will ease your pain).
Node.js/React
Secondly, our development team has created a brand new advanced application. Details are here.
NOTE: We'll continue to iterate and improve on both, so please share with us your feedback!
OAuth2.0 changes
A consolidation and streamlining of the OAUTH and OAUTH_ADMIN PLSQL packages. The details:
We’ve consolidated those mentioned above into these two new packages:
ORDS_SECURITY
ORDS_SECURITY_ADMIN
The OAUTH and OAUTH_ADMIN PL/SQL Packages have been deprecated by royal decree. However, they’ll still be included until ORDS version 25.3 (this time next year).
Creating a client and receiving your Client ID and Client Secret is now streamlined, and Client Secrets can now be rotated (by supporting two active Client Secrets while in rotation).
Locating the new PL/SQL Packages:
Finding PL/SQL Packages in Database Actions.
23ai Boolean
ORDS now returns BOOLEAN types as JSON TRUE|FALSE properties instead of 0|1.
What this looks like in practice with various configurations1:
Oracle Database 23ai + ORDS 24.3
Oracle Database 23ai + ORDS 24.2
Oracle DB 21c Enterprise Edition + ORDS 24.3
Not possible. #RedHerring
1 Thank you, internet stranger, for providing us with this juicy bit of code.
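Purely as an illustration of the JSON difference (this is plain Python, not ORDS code, and the column name is made up): a real JSON boolean serializes as true/false, while the old behavior surfaced a BOOLEAN as a number.

```python
import json

# Old behavior (illustrative): BOOLEAN surfaced as NUMBER 0/1.
old_style = json.dumps({"in_stock": 1})

# ORDS 24.3 + Oracle Database 23ai: a genuine JSON boolean.
new_style = json.dumps({"in_stock": True})
```

The difference matters to clients: `payload["in_stock"]` is now a real boolean, not a number you have to truth-test.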
Mong[ooohhh, no, you didn’t?!] DB API
Support for even more Database Administration commands:
listIndexes
dropIndexes, and
optional parameter expireAfterSeconds (which applies to the createIndexes command)
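For context, expireAfterSeconds is MongoDB’s TTL-index option. A createIndexes command carrying it looks roughly like this (the collection and field names here are made up):

```python
import json

# A hypothetical createIndexes command document: documents in "sessions"
# expire one hour after their "createdAt" timestamp.
create_indexes_cmd = {
    "createIndexes": "sessions",
    "indexes": [
        {
            "key": {"createdAt": 1},
            "name": "createdAt_ttl",
            "expireAfterSeconds": 3600,
        }
    ],
}

cmd_json = json.dumps(create_indexes_cmd)
```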
The following MongoDB Aggregation Stages are now supported:
UPDATE: MongoDB API update article (October 10, 2024)
A brand new article about the latest MongoDB API updates just dropped! Thanks to Hermann for publishing and sharing the latest. Details are here.
Documentation
Introduced the following new sections:
6.2.4 Using OCI Monitoring Service with Oracle REST Data Services
This new section details the configuration of the recently added ords-metrics utility. You can find details on how to set up this monitoring service (to communicate with OCI) here.
3.2 Deploying ORDS with Central Configuration Server
Along with the updated docs, we’ve included the OpenAPI spec for creating the endpoints required for a central configuration server (and a special video clip of me retrieving the PL/SQL definitions and the OpenAPI spec in Database Actions).
In our Release Notes, we claim support for the following JDKs:
Oracle Java 11, 17, or 21
Oracle GraalVM Enterprise Edition for Java 11
Oracle GraalVM Enterprise Edition for Java 17
Oracle GraalVM Enterprise Edition for Java 21
However, this may be confusing regarding Oracle GraalVM Enterprise Editions. You should know that there are currently TWO Oracle GraalVM Enterprise Edition JDKs:
Oracle GraalVM Enterprise Edition 20
Oracle GraalVM Enterprise Edition 21
Instead of how we’ve presented them, here is a cleaner breakdown of these JDKs:
Oracle GraalVM Enterprise Edition 20
Linux (x86-64): Java 8, 11
macOS (x86-64): Java 8, 11
Windows (x86-64): Java 8, 11
Oracle GraalVM Enterprise Edition 21
Linux (x86-64 and aarch64): Java 8, 11, 17
macOS (x86-64 only): Java 8, 11, 17
Windows (x86-64 only): Java 8, 11, 17
Oracle GraalVM Enterprise Edition details
So when you are choosing your JDK (to use with ORDS), make sure you consider your platform and use cases. Details on using GraalVM with ORDS here.
fin
This concludes the release notes supplement.
This space ⬇️ left intentionally blank.
Follow
And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!
I’m trying to be as succinct as possible. I just accomplished this but couldn’t find anything that explicitly walked me through how to do it. Here are some assumptions:
You have followed these instructions for Installing Flask on an Ubuntu Virtual Machine (VM) in Oracle Cloud Infrastructure1
You at least have a Free Tier account in OCI
You have an existing GitHub account
You’re already signed in (via SSH, presumably) to your Linux Virtual Machine (like I am in the images you’ll soon see)
I suppose you don’t need anything related to Flask or Python; you need to follow the steps to the point where you need to git clone something in your Virtual Machine (aka your Compute Instance). ↩︎
REMINDER: You are signed in to a Linux machine! The commands you will follow (in GitHub or here) WILL NEED TO BE FOR LINUX!
What happened?
I am trying to clone an already existing repository in my Virtual Machine.
I received the following fatal: Authentication failed for... error when attempting to authenticate with GitHub from within my Virtual Machine with my GitHub username and password:
Epic failure when trying to sign in using your username and password.
Authenticating with SSH
Since the HTTPS username-and-password method of authenticating no longer works (GitHub retired password-based authentication for Git operations in 2021), the next available option is to clone via SSH.
Even if you don’t know what this means, as long as you follow the Linux instructions in the GitHub documentation, you’ll be fine. First, you’ll want to check to see if you have existing SSH keys in your Linux VM (spoiler: you won’t, at least not for GitHub).
Then, add your PRIVATE key (NOT your public key) to your SSH agent (don’t even ask me what the hell an SSH agent is, isn’t, or does). I presume the SSH agent assists in authentication.
Finally, add your Public key to GitHub. They’ve provided instructions here, but I have included some screenshots of what this looks like when done for your OCI VM.
Navigate to Settings. Next, click SSH and GPG keys. Add your public SSH key (with your email, too)!
Once you’ve saved it, you can use it to git pull from within your OCI VM, as can be seen here:
And that’s all for now. I mostly wrote this blog out of sheer frustration. Please share it with your friends and bookmark it for future use!
Follow
And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!
If you are coming from the previous related post, then you’ll recall I used the following SQL query:
Remember, this SQL is querying the Movie View.
My next step is to take this SQL and bring it to the REST Workshop, where I’ll turn it into an API.
REST Workshop
There are several ways you can navigate to the REST Workshop. Typically, I return to the Database Actions LaunchPad. From there, I select REST.
The Handler code
I've already created my Resource Module, Template, and Handler. I kept everything default, with no authentication enabled.
The only thing I changed was the SQL query. I removed the final line, fetching the first 10 only. I want to be able to control the pagination of the API. If I were to keep that last line, this eventual endpoint would always only return the first 10 rows. And what if I want the next ten rows thereafter? Well, if I hard-code this, then I can’t really make that work. So, I chose to leave it open-ended.
Technically, it is NOT open-ended because I retained the default pagination of 25. But, by removing that fetch first 10 rows condition, I can now fetch ALL rows that fit those parameters (in increments of 25).
If I visit this new endpoint, it will appear like this:
And if I collapse the items, you’ll see something that is EXTREMELY confusing. If I removed that fetch first 10 rows condition in the original SQL query, then why do we see a limit and offset of 10?
The answer is that I actually set the Items Per Page equal to 10 (in the Resource Handler). This is the REST equivalent of a dirty joke. Consider yourself roasted…
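If you want to page beyond the defaults, ORDS endpoints with pagination enabled honor limit and offset query parameters. A small hypothetical helper (the base URI is made up) for building page URIs:

```python
# Hypothetical helper for paging through an ORDS endpoint that has
# pagination enabled (limit/offset are ORDS's paging query parameters).
def page_uri(base_uri: str, limit: int = 25, page: int = 0) -> str:
    return f"{base_uri}?limit={limit}&offset={page * limit}"

first_page = page_uri("https://example.com/ords/demo/movies/", limit=10)
third_page = page_uri("https://example.com/ords/demo/movies/", limit=10, page=2)
```

So a limit of 10 with a page index of 2 asks for rows 21–30, which is exactly the limit/offset arithmetic you see echoed back in the response.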
JavaScript
With that endpoint live, I can take the API and drop it into some sample JavaScript and HTML code.
JavaScript and HTML
I learned a great deal about this JavaScript by reviewing this YouTube video. That is where I learned how to map through the items of my ORDS payload. And there was a refresher on JavaScript string interpolation (with template literals) too!
PAUSE: Don't be too intimidated by string interpolation and template literals! Read the link I included, and take your time. If you are coming from Python, it's similar to Jinja (when using Flask) and f-string literals 🙃.
You can see that I’m using the map() method to iterate through all the data in my JSON payload. Remember, this was the payload in the items portion of my endpoint!
I believe the item in list.map((item) is a reference to an individual item in line 4’s data.items. I think this because if I change the items in lines 7–10 of my JavaScript to something random, like the name bobby, things start to break:
However, if I change everything back to item, and start the live server in VS Code, I’ll be met with the following rendering:
That’s it, though. Combining the ORDS API, the Fetch API, simple JavaScript, and HTML will allow you to create this straightforward web page.
Reviewing Inspector, Console, Network
I also have a few more screenshots, one each for the HTML Inspector, Console Log, and Client/Server Network. All of these show what is happening under the covers but in different contexts.
Inspector
In the Inspector, you can see how the JavaScript map() method plus the document.querySelector() in line 18 of the JavaScript code work in tandem with line 12 of the HTML script to display contents on the page:
Console
Here, you can see the items in the Console. This is because we added console.log(item) in line 19 of the JavaScript code.
Network
Finally, you can see the 200 GET request from our ORDS API. Then, on the far right of the screen, you can see the JSON payload coming from that same ORDS endpoint.
Admittedly, the way the “Cast” is displayed is not correct. That is yet another array of cast members. And I’ve yet to learn how to structure that correctly. So, if you are reading this, and you know, let me know!
In this example, we call this collection Movie_Collection.
-- create and load movie json collection from a public bucket on object storage
begin
    dbms_cloud.copy_collection (
        collection_name => 'MOVIE_COLLECTION',
        file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/c4u04/b/moviestream_gold/o/movie/movies.json',
        format          => '{ignoreblanklines:true}'
    );
end;
/
👆🏻 This is the code I used to create, copy, and ingest this collection into my Autonomous Database.
README: As far as I know, the link above (the one in the code example) will remain stable for the foreseeable future. I'm pretty sure we use it in one of many of our LiveLabs. A lot of what I'm covering here is actually in Task 7 of this LiveLab.
Notice the “PL/SQL procedure successfully completed.” message.
Create a view
With that JSON collection in place, I can then create (aka I’ll continue stealing this code from the same LiveLab) a View of it using the following SQL code:
/* Create a view over the collection to make queries easy */
create or replace view movie as
select
    json_value(json_document, '$.movie_id' returning number) as movie_id,
    json_value(json_document, '$.title') as title,
    json_value(json_document, '$.budget' returning number) as budget,
    json_value(json_document, '$.list_price' returning number) as list_price,
    json_value(json_document, '$.gross' returning number) as gross,
    json_query(json_document, '$.genre' returning varchar2(400)) as genre,
    json_value(json_document, '$.sku' returning varchar2(30)) as sku,
    json_value(json_document, '$.year' returning number) as year,
    json_value(json_document, '$.opening_date' returning date) as opening_date,
    json_value(json_document, '$.views' returning number) as views,
    json_query(json_document, '$.cast' returning varchar2(4000)) as cast,
    json_query(json_document, '$.crew' returning varchar2(4000)) as crew,
    json_query(json_document, '$.studio' returning varchar2(4000)) as studio,
    json_value(json_document, '$.main_subject' returning varchar2(400)) as main_subject,
    json_query(json_document, '$.awards' returning varchar2(4000)) as awards,
    json_query(json_document, '$.nominations' returning varchar2(4000)) as nominations,
    json_value(json_document, '$.runtime' returning number) as runtime,
    json_value(json_document, '$.summary' returning varchar2(10000)) as summary
from movie_collection;
Here is what the code looks like in the SQL Worksheet (a part of Database Actions).
Accomplished in the SQL Worksheet.
With that View created, you could go one step further and query with even more specific SQL. In this case, I’ll query the View but exclude any entries where a movie cast does not exist:
select title, year, gross, cast
from movie
where cast is not null
order by 3 desc nulls last
fetch first 10 rows only;
Here is the SQL, with the Script Output below:
Notice the output of the executed SQL.
ORDSify it®
With ORDS, we can REST-enable pretty much any database object we want.
I have objects, Greg. Can you REST-enable me?
But after spending a few minutes with this collection, I found the MOVIE View to be the easiest, most sensible object to highlight. It’s a straightforward process, with primarily right-mouse clicks.
From the Navigator Panel, select Views from the list of available database objects.
From the Navigator Tab, select Views from the drop-down menu.
Then, right-click on the Movie View, and select REST, then Enable.
Right-click on the Movie View and select REST then Enable.
A slider will appear, for this example, I’ll keep everything default and click Enable (no authentication either, I’m being lazy).
This slider will appear; since you are auto-REST enabling, you can accept the default settings and click Enable.
A Confirmation notification will appear in the upper right-hand corner of the browser 👍🏻.
You’ll know it worked because you’ll see this confirmation message appear.
Navigate back to the Movie View, and select the cURL command option. Twasn’t there before, tis now!
Go back to the Movie view and select the [new] cURL command option.
Select the GET ALL endpoint, and copy the URI. JUST the URI portion!
From the list of available auto-magically created REST endpoints, copy the GET ALL endpoint.
Open a new browser tab, or window, and navigate to the URI. You’ll see everything in the Movie View now!
That endpoint (URI) in the browser looks like this 👆🏻
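For a sense of what you're looking at in the browser, here is a minimal sketch of the JSON envelope ORDS returns for an auto-REST GET ALL request. The movie rows below are made up; the items, hasMore, limit, and offset fields are the standard auto-REST response shape:

```python
import json

# A made-up sample of the JSON envelope an auto-REST GET ALL request returns:
# the rows live under "items", and "hasMore" signals whether more pages exist.
sample_response = json.loads("""
{
  "items": [
    {"movie_id": 1, "title": "Big Movie", "year": 2001, "gross": 500},
    {"movie_id": 2, "title": "Bigger Movie", "year": 2004, "gross": 900}
  ],
  "hasMore": false,
  "limit": 25,
  "offset": 0,
  "count": 2
}
""")

# Pull the movie titles out of this page of results.
titles = [row["title"] for row in sample_response["items"]]
print(titles)  # ['Big Movie', 'Bigger Movie']
print(sample_response["hasMore"])  # False -> no further pages to fetch
```

Each entry in a real response also carries its own self link, which is how ORDS gives you a URI per row.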
Why a view?
Yeah, good question. There may be a performance benefit to using Views; based on what I’m finding, I can’t say definitively. But they require zero storage, so that’s at least a savings somewhere. Additionally, I think the customization is pretty compelling, too. What do I mean by that? Well, allow me to elucidate:
Views can provide a different representation (such as subsets or supersets) of the data that resides within other tables and views. Views are very powerful because they allow you to tailor the presentation of data to different types of users.
In this case, the Movie View returns everything found in the collection. I can subset this even further though; by taking that SQL query and REST-enabling it, too. I will, but in a future post.
For now, I’ve left you with an easy way to REST-enable a View (In this case, based on a JSON Collection) that resides in your Autonomous Database.
If you want to try the LiveLab (which you should, as it’s easy and VERY informative), go here. You’ll need an Always Free OCI account, too (so you can provision an Autonomous Database). You can sign up here.
Oh, and we are entering into the season of Cloud World 2024, so mark your calendars 🤪!
That’s all for now 😘.
Follow
And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!
BREAKING NEWS!! APEX packages with ORDS and SQLcl now available in Oracle Linux YUM servers and through ULN
The deets
Here is what you need to know. We are releasing three new APEX (a.k.a. “The Artist formerly known as Oracle Application Express”) packages for the following repositories:
ol7_oci_included
ol8_oci_included
ol9_oci_included
NoARCH RPMs
One more update! These RPMs will be packaged as noarch, meaning they are architecture-independent: you can install them on x86_64 (Intel and AMD) or ARM (aarch64) systems. As a reminder, OCI Always Free tenancy Compute Instances are available with:
Micro instances (I’m using a VM.Standard.E2.1.Micro; an AMD processor)
TIP: These will all be available through both the publicly available Oracle YUM servers and the Unbreakable Linux Network (ULN) channels.
Fortunately, I was provided advanced copies of the brand-new RPM files that will be included in their respective oci_included repositories. You’ll soon have access to no less than three flavors of APEX (and their dependencies). YUM!
You’ll soon see the following packages:
Oracle Application Express 23.1
NOTE: In this write-up, and the images below, you'll see me reference this as apex-core.
The Oracle Application Express package contains everything you need to install APEX in your database. We also package it with the SQLcl package to simplify installation.
Oracle Application Express Images 23.1
NOTE: In this write-up, and the images below, you'll see me reference this as apex-images.
This Oracle Application Express Images package includes your APEX images (i.e., you may be familiar with this as these are symlinked to your /i/ virtual directory for Oracle APEX images). This package also contains the ORDS package, making it easier to run the APEX image of your choice.
This package will also automatically add a symbolic link (aka symlink) from your APEX images files to your default ORDS doc_root path.
Oracle Application Express Patch Set Bundle 23.1.3
NOTE: In this write-up, and the images below, you'll see me reference this as apex-patch.
The Oracle Application Express Patch Set Bundle provides a means for you to take the latest APEX patch sets, compare them against your existing base APEX images, overwrite those base files with the latest changes, and then place them into their own /opt/oracle/apex/[latest patch]/images directory. Oh, and you’ll also retain the APEX base images in their own /opt/oracle/apex/[apex base]/images directory too!
And when you’re ready, you can update APEX in your database with the latest patches. The package even symlinks the newest patch set images to your APEX installation’s virtual image directory (i.e., the /i/ directory). And finally, we automatically symlink (as mentioned in the Oracle Application Express Images 23.1 section) this folder to your default ORDS doc_root path.
REMEMBER: If you create a Compute Instance in OCI, these repositories will automatically be accessible to you (if you aren't though, check this resource out to get set up)!
A week ago, I had limited (i.e., zero) experience with the Linux package managers Yellowdog Updater Modified (YUM) and Dandified YUM (DNF). But since then, I’ve tinkered around with Oracle Linux (OL) quite a bit. I even successfully installed Oracle Linux 8 (OL8) on an old Raspberry PI 3 Model B V1.2 (which, if you can believe it, was first released in 2016)!
PAUSE: If you want to see a tutorial of how I did this, let me know in the comments, or @ me on Twitter/X or Threads.
Exploring OL8 and OCI Compute Instances
If you have an OCI Always Free account, you too can get acquainted with Oracle Linux 8 (OL8), as that is provided when creating a new Compute Instance in your OCI tenancy.
After creating that Instance, you can SSH into it and check out what Oracle Linux ships with. Once in, use one of these commands (depending on the version):
# Command for Oracle Linux 7:
yum list *release-el7
# Command for Oracle Linux 8:
dnf list *release-el8
# Command for Oracle Linux 9:
dnf list *release-el9
And here I am checking the packages and repositories (with the dnf list *release-el8 command) in my Compute Instance:
Since I first created this Compute Instance, I’ve been tinkering with DNF and the repositories, so your output might look different. But at a minimum, you can expect to see the ol8_oci_included and ol8_baseos_latest repositories.
Oracle Database Tools via YUM or DNF
NOTE: This next section WILL NOT apply if you are working with the Oracle Linux that is provided in your OCI Compute Instance!!!
You won’t find a direct link to our Database Tools repository if you do this independently (i.e., not creating an OCI Compute Instance as I did). That is an easy fix; you can manually configure the repositories.
NOTE: If you are doing this through OCI (as in with a newly-created Compute Instance), you won't need to configure this, it will be included in the ol8_oci_included repository (this applies to OL7 and OL9 too)!
But for manual configuration, here are the direct links for the available Oracle Linux versions (you’ll want to use whichever one corresponds to your Linux version):
Oracle Linux 7 http://yum.oracle.com/repo/OracleLinux/OL7/oracle/software/x86_64
Oracle Linux 8 http://yum.oracle.com/repo/OracleLinux/OL8/oracle/software/x86_64
You can manually configure the repositories like this:
You can then inspect the packages in this repository with the following command:
dnf repository-packages [the repo id] info --all
I can now download our ORDS and SQLcl packages, our two must-haves:
In the above images, you can see these packages are being installed from the ol8_oci_included repository. This is because Compute Instances in OCI automatically have access to the Unbreakable Linux Network (ULN); you can read all about it in the OCI documentation found here.
NOTE: Once again, taking the extra step to configure this additional repository is unnecessary; I'm only demonstrating how to do it if you are working with something other than an OCI Compute Instance.
The big reveal/super secret surprise
If you’ve been working with Linux (and specifically Oracle Linux), then none of this will be much of a surprise to you. And you probably (definitely) know more about this than I do. But what you DON’T know is that we are releasing three new APEX packages for the ol7_oci_included, ol8_oci_included, and ol9_oci_included repositories.
Since I know a guy (thanks, Adrian 🤫), I was provided advanced copies of the brand-new RPM files that will be included in the respective oci_included repositories. You’ll soon have access to three flavors of APEX (and their dependencies):
Oracle Application Express 23.1
The Oracle Application Express package contains everything you need to install APEX in your database. We also package it with the SQLcl package to simplify installation.
Oracle Application Express Images 23.1
The Oracle Application Express Images package includes your APEX images (i.e., you may be familiar with this as these are symlinked to your /i/ virtual directory). This package also contains the ORDS package, making it easier to run the APEX image of your choice.
It will also automatically add a symbolic link (aka symlink) from your APEX image files to your default ORDS doc_root path.
Oracle Application Express Patch Set Bundle 23.1.3
The Oracle Application Express Patch Set Bundle provides a means for you to take the latest APEX patch sets, compare them against your existing base APEX images, overwrite those base files with the latest changes, and then place them into their own /opt/oracle/apex/[latest patch]/images directory. Oh, and you’ll also retain the APEX base images in their own /opt/oracle/apex/[apex base]/images directory too!
And when you’re ready, you can update APEX in your database with the latest patches. The package even symlinks the newest patch set images to your APEX installation’s virtual image directory (i.e., the /i/ directory). And finally, we automatically symlink (as mentioned in the Oracle Application Express Images 23.1 section) this folder to your default ORDS doc_root path.
REMEMBER: If you create a Compute Instance in OCI, these repositories will automatically be accessible to you (if you aren't though, check this resource out to get set up)!
That was a lot of words. But in practice, the following images illustrate what to expect when you issue the dnf install command (for the APEX-CORE or APEX-IMAGES packages).
APEX-CORE
REMINDER: The official package name will be Oracle Application Express 23.1.
APEX-CORE install via a local RPMCompleting the APEX-CORE package installation, revealing helpful installation information
APEX-IMAGES
REMINDER: The official package name will be Oracle Application Express Images 23.1.
Alternatively, you may also decide to install the APEX-IMAGES packages (if you intend on using many versions of APEX along with its ORDS companion):
APEX-IMAGES install via a local RPMCompleting the APEX-IMAGES package installation, revealing helpful installation information
Once these packages are available in the ol7_oci_included, ol8_oci_included, and ol9_oci_included repositories, install commands become much more straightforward. You won’t have to bother with the individual RPM files; you can issue either the yum install or dnf install commands.
NOTE: We switched to DNF for OL8 and later but the commands are symlink'd so I believe either will work. You can read about that here.
The only thing you have to do now is decide which packages you’ll want to install!
Check for updates
This last section is just a reminder. If you haven’t done so yet, check for updates to all your repositories (namely the ol7_oci_included, ol8_oci_included, and ol9_oci_included repositories).
TIP: You can issue the yum --help or dnf --help command to refresh your memory on which "update" command to use.
But once again, I suspect you already know how to do this if you have more than a week’s experience with Oracle Linux, YUM, or DNF. Be on the lookout for updates to the Oracle Linux release notes too!
The plan was to create an ORACLE REST endpoint and then POST a CSV file to that auto-REST enabled table (you can see how I did that here, in section two of my most recent article). But, instead of doing this manually, I wanted to automate this POST request using Apple’s Automator application…
Me…two paragraphs from now
Follow along with the video
The plan
I did it. I went so far down the rabbit hole, I almost didn’t make it back alive. I don’t know when this ridiculous idea popped into my head, but it’s been well over a year. Until now, I either hadn’t had the time or the confidence to really tackle it.
The plan was to create an ORACLE REST endpoint and then POST a CSV file to that auto-REST enabled table (you can see how I did that here, in section two of my most recent article). But, instead of doing this manually, I wanted to automate this POST request using Apple’s Automator application.
The use case I made up was one where a person would need to periodically feed data into a table. The data doesn’t change, nor does the target table. Here is an example of the table I’m using:
The basic structure of the Bank Transfers table
And the DDL, should there be any interest:
CREATE TABLE ADMIN.BANK_TRANSFERS
(TXN_ID NUMBER ,
SRC_ACCT_ID NUMBER ,
DST_ACCT_ID NUMBER ,
DESCRIPTION VARCHAR2 (4000) ,
AMOUNT NUMBER
)
TABLESPACE DATA
LOGGING
;
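If you want a test file for this table, here is a sketch that builds a CSV whose header matches the BANK_TRANSFERS columns from the DDL above. The rows themselves are invented sample data:

```python
import csv
import io

# Column names taken from the BANK_TRANSFERS DDL; the rows are made-up samples.
columns = ["TXN_ID", "SRC_ACCT_ID", "DST_ACCT_ID", "DESCRIPTION", "AMOUNT"]
rows = [
    (1, 100, 200, "rent", 1500),
    (2, 100, 300, "utilities", 250),
]

# Write the header plus rows into an in-memory CSV buffer.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(columns)
writer.writerows(rows)

csv_text = buffer.getvalue()
print(csv_text)
```

Save that text to a .csv file and it is the kind of payload a Batch Load request expects.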
Once this table was created, I auto-REST enabled the table and retrieved the complete cURL Command for performing a Batch Load request. Remember, we have three examples for cURL Commands now; I chose Bash since I’m on a Mac:
Retrieving the Batch Load cURL Command
Once I grabbed the cURL Command, I’d temporarily save it somewhere handy (e.g., VS Code, TextEdit, etc.). I’d then create a new folder on my desktop.
The newly created ords_curl_post folder
How I actually did it
I’d then search via Spotlight for the Automator application. Once there, I’d choose Folder Action.
Choosing Folder Action for this automation
HEY!! README: I'm going to breeze through this. And it may seem like I am well-acquainted with this application. I am not. I spent hours debugging, reading through old StackExchange forums, and Apple documentation so I could share this with you. There is a ton more work to do. But bottom line, this thing works, and it's something that is FREE and accessible for a lot of people. You could have a TON of fun with this stuff, so keep reading!
There’s no easy way to get around this, but to get really good at this, you’ll just need to tinker. Luckily, most of these automation modules are very intuitive. And there is a ton of information online on how to piece them all together.
Automator 🤖
All of these modules are drag-and-drop, which makes it easy to create an execution path for your Folder Action application. Eventually, I ended up with this (don’t worry, I’ll break it down some; a video is in the works for a more detailed overview):
Complete Folder Action automation for the ORDS Batch Load request
The modules
The modules I’m using are:
Get Specified Finder Items
Get Folder Contents
Run Shell Script (for a zsh shell, the default for this MacBook)
Set Value of Variable
Get Value of Variable
Display Notification
You can see at the very top that I have to choose a target folder, since this is a Folder Action. I chose the folder I created: ords_curl_post.
Get Specified Finder Items and Get Folder Contents
The first two modules are pretty straightforward. You get the specified finder items (from that specific folder). And then get the contents from that folder (whatever CSV file I drop in there). That will act as a trigger for running the shell script (where the filename/s serve as the input for the cURL Command).
PAUSE: I must confess, I had essentially ZERO experience in shell scripting prior to this, and I got it to work. It's probably not the prettiest, but damn if I'm not stoked that this thing actually does what it is supposed to do.
The main considerations for this shell script are that you’ll want to stay with zsh and you’ll want to choose “as arguments” in the “Pass input” dropdown menu. Choosing “as arguments” allows you to take that file name and apply it to the For Loop in the shell script. I removed the echo "$f" because all it was doing was printing out the file name (which makes sense, since it was the variable in this script).
Choosing “as arguments“
The Shell Script
That cURL Command I copied from earlier looks like this:
I made some modifications though. I made sure Content-Type was text/csv. And then I added some fancy options for additional information (more details on this here, go nuts) when I get a response from the database.
REMINDER: I didn't know how to do this until about 30 mins before I got it to work. I'm emphasizing this because I want to drive home the point that with time and some trial-and-error, you too can get something like this to work!
With my changes, the new cURL Command looks like this:
What a mess… That -w option stands for write-out. When I receive the response from the Batch Load request, I’ll want the following information:
Response Code (e.g. like a 200 or 400)
Total Upload Time
Upload Speed
Upload Size
All of that is completely optional. I just thought it would be neat to show it. Although, as you’ll see in a little bit, Apple notifications have some weird behavior at times, so you don’t really get to see all of the output.
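Those same write-out fields can be captured in plain Python, too. This sketch POSTs a CSV payload to a throwaway local stub server (standing in for the real ORDS batch-load URL, which I'm not reproducing here) and records the curl-style metrics using only the standard library:

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A throwaway local server standing in for the ORDS batch-load endpoint,
# so the metrics below can be demonstrated without a real database.
class StubHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        self.rfile.read(int(self.headers["Content-Length"]))  # consume the upload
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"0 rows rejected")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

csv_payload = b"TXN_ID,SRC_ACCT_ID,DST_ACCT_ID,DESCRIPTION,AMOUNT\n1,100,200,rent,1500\n"
url = f"http://127.0.0.1:{server.server_port}/stub/batchload"  # placeholder path

request = urllib.request.Request(
    url, data=csv_payload, headers={"Content-Type": "text/csv"}, method="POST"
)

start = time.perf_counter()
with urllib.request.urlopen(request) as response:
    status = response.status      # like curl's %{response_code}
    body = response.read()
elapsed = time.perf_counter() - start  # like curl's %{time_total}
upload_size = len(csv_payload)         # like curl's %{size_upload}

print(status, round(elapsed, 3), upload_size)
server.shutdown()
```

Swap the stub URL for your real Batch Load endpoint and you get the same response code, timing, and upload-size details the -w option prints.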
I then applied the cURL command to the shell script, (with some slight modifications to the For Loop), and it ended up looking like this:
New shell script with updated cURL command
Here is what the output looked like when I did a test run (with a sample CSV):
Success on the cURL command
Set Value of Variable
All of that output, referred to as “Results”, will then be set as a variable. That variable will henceforth be known as responseOutput (fun fact: that naming style is called camel casing…I learned that like 3-4 months ago). You’ll first need to create the variable, and once you run the folder action, it’ll apply the results to that variable. Like this:
Creating a new variableResults from cURL command applied to variable
Get Value of Variable and Display Notification
Those next two modules simply “GET” the value of the variable/results and then send that value to the Display Notification module. This section is unremarkable; moving on.
And at this point, I was done. All I needed to do was save the script and then move on to the next step.
Folder Actions Setup
None of this will really work as intended until you perform one final step. I’ll right-click the target folder and select “Folder Actions Setup.” From there, a dialog will appear; you’ll want to make sure both the folder and the script are checked.
Selecting Folder Actions SetupDouble checking that everything is enabled
Trying it out
Next, I emptied the folder. Then I dropped in a 5000-row CSV file and let Folder Actions do its thing. This entire process is quick! I’m loving the notification, but the “Show” button simply does not work (I think that is a macOS quirk though). However, when I go back to my Autonomous Database, I can 100% confirm that this ORDS Batch Load worked.
Successful Batch LoadDouble checking the Autonomous Database
Final thoughts
This was relatively easy to do. In total, it took me about 3-4 days of research and trial and error to get this working. There is a lot I do not know about shell scripting. But even with a rudimentary understanding, you too can get this to work.
Next, I’d like to create a dialog window for the notification (the output from the cURL Command). I believe you can do that in AppleScript; I just don’t know how yet.
If you are reading this and can think of anything, please leave a message! If you want to try it out for yourself, I’ve shared the entire workbook on my GitHub repo; which can be found here.
I’ll also be doing an extended video review of this, where I’ll recreate the entire automation from start to finish. Be on the lookout for that too!
Overview and connecting with the python-oracledb library
Part II
Connecting with Oracle REST APIs unauthenticated
Part III
Custom Oracle REST APIs with OAuth2.0 Authorization
Welcome back
I finally had a break in my PM duties to share a small afternoon project [I started a few weeks ago]. I challenged myself to a brief Python coding exercise. I wanted to develop some code that allowed me to connect to my Autonomous Database using either our python-oracledb driver (library) or with Oracle REST Data Services (ORDS).
I undertook this effort as I also wanted to make some comparisons and maybe draw some conclusions from these different approaches.
NOTE: If you don't feel like reading this drivel, you can jump straight to the repository where this code lives. It's all nicely commented and has everything you need to get it to work. You can check that out here.
The test files
Reviewing the code, I’ve created three Python test files. test1.py relies on the python-oracledb library to connect to an Oracle Autonomous database while test2.py and test3.py rely on ORDS (test3.py uses OAuth2.0, but more on that later).
test1.py using the python-oracledb librarytest2.py relies on an unsecured ORDS endpointtest3.py with ORDS, secured with OAuth2
Configuration
Configuration directory
I set up this configuration directory (config_dir) to abstract sensitive information from the test files. My ewallet.pem and tnsnames.ora files live in this config_dir. These are both required for Mutual TLS (mTLS) connection to an Oracle Autonomous database (you can find additional details on mTLS in the docs here).
ewallet.pem and tnsnames.ora files
Other files
OAuth2.0, Test URLs, and Wallet Credential files
Other files include oauth2creds.py, testurls.py, and walletcredentials.py. Depending on the test case, I’ll use some or all of these files (you’ll see that shortly).
NOTE: If not obvious to you, I wouldn't put any sensitive information into a public git repository.
Connecting with python-oracledb
One approach to connecting via your Oracle database is with the python-oracledb driver (library). An Oracle team created this library (people much more experienced and wiser than me), and it makes connecting with Python possible.
FYI: I’m connecting to my Autonomous Database. If you want to try this, refer to the documentation for using this library and the Autonomous database. You can find that here.
The Python code that I came up with to make this work:
# Connecting to an Oracle Autonomous Database using the python-oracledb driver.

import oracledb

# A separate python file I created and later import here. It contains my
# credentials, so as not to show them in this script here.
from walletcredentials import uname, pwd, cdir, wltloc, wltpwd, dsn

# Requires a config directory with ewallet.pem and tnsnames.ora files.
with oracledb.connect(user=uname, password=pwd, dsn=dsn,
                      config_dir=cdir, wallet_location=wltloc,
                      wallet_password=wltpwd) as connection:
    with connection.cursor() as cursor:
        # SQL statements should not contain a trailing semicolon (";")
        # or forward slash ("/").
        sql = """select * from BUSCONFIND where location='ZAF'
                 order by value ASC"""
        for r in cursor.execute(sql):
            print(r)
In Line 7, you can see how I import the wallet credentials from the walletcredentials.py file. Without that information, this code wouldn’t work. I also import the database username, password, and configuration directory (which includes the ewallet.pem and tnsnames.ora files).
From there, the code is pretty straightforward. However, some library-specific syntax is required (the complete details are in the docs, found here), but aside from that, nothing is too complicated. You’ll see the SQL statement in Lines 16-17; the proper SQL format looks like this:
SELECT * FROM busconfind WHERE location='zaf'
ORDER BY value ASC;
And here is an example of this SQL output in a SQL Worksheet (in Database Actions):
Reviewing the SQL in Database Actions
FYI: This is a Business Confidence Index data-set, in case you were curious (retrieved here).
That SQL allows me to filter on a Location and then return those results in ascending order according to the Value column. When I do this using the python-oracledb driver, I should expect to see the same results.
NOTE: You've probably noticed that the SQL in the python file differs from that seen in the SQL Worksheet. That is because you need to escape the single quotes surrounding ZAF, as well as remove the trailing semi-colon in the SQL statement. It's all in the python-oracledb documentation; you just have to be aware of this.
Once I have all the necessary information in my walletcredentials.py file, I can import that into the test1.py file and execute the code. I chose to run this in an Interactive Window (I’m using VS Code), but you can also do this in your Terminal. In the images (from left to right), you’ll see the test1.py file, then a summary of the output from that SQL query (contained in the test1.py code), and finally, the detailed output (in a text editor).
Executing the Python code in an Interactive WindowSummary output from test1.pyDetailed output from test1.py
Wrap-up
For those that have an existing Free Tier tenancy, this could be a good option for you. Of course, you have to do some light administration. But if you have gone through the steps to create an Autonomous database in your cloud tenancy, you probably know where to look for the tnsnames.ora and other database wallet files.
I’m not a developer, but I think it would be nice to simplify the business logic found in this Python code, or maybe abstract it completely. For prototyping an application (perhaps one that isn’t microservices-oriented) or for data and business analysts, this could do the trick for you. In fact, the data is returned to you in rows of tuples, so turning this into a CSV or reading it into a data analysis library (such as pandas) should be fairly easy!
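To make that last point concrete, here is a sketch turning rows of tuples (shaped like what cursor.execute() yields; these rows and column names are made up) into CSV text with only the standard library:

```python
import csv
import io

# Made-up rows shaped like what a python-oracledb cursor yields: one tuple per row.
rows = [
    ("ZAF", 2021, 98.7),
    ("ZAF", 2022, 101.2),
]

# Write a header (illustrative column names) and the rows as CSV text.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["LOCATION", "YEAR", "VALUE"])
writer.writerows(rows)

csv_text = buffer.getvalue()
print(csv_text)
```

From there, feeding the same rows into a pandas DataFrame (pandas.DataFrame(rows, columns=...)) is a one-liner, if you have that library installed.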
Connecting via ORDS: sans OAuth2.0
Auto-REST and cURL
I’m still using the “devuser” (although this may be unnecessary, as any unsecured REST-enabled table would do). I’m using the same table as before; the only change I’ve made is to auto-REST enable the BUSCONFIND table for the test2.py code.
In the following images, I’m retrieving the cURL command for performing a GET request on this table.
NOTE: In a recent ORDS update, we made available different shell variations (this will depend on your OS); I've selected Bash.
From there, I take the URI (learn more on URIs) portion of the cURL command and place it into my browser. Since this table is auto-REST enabled, I’ll only receive 25 rows from this table.
NOTE: The ORDS default pagination is limit = 25.
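Because auto-REST pages results 25 at a time, fetching everything means walking the offset. Here is a small sketch of how the page URLs line up; the base URL is a placeholder, not a real endpoint:

```python
# Sketch: compute page URLs for an auto-REST endpoint, which pages with
# "offset" and "limit" query parameters (the default limit being 25).
def page_url(base_url: str, offset: int, limit: int = 25) -> str:
    """Return the URL for the page of rows starting at `offset`."""
    return f"{base_url}?offset={offset}&limit={limit}"

# Placeholder base URI standing in for a real REST-enabled table endpoint.
base = "https://example.adb.oraclecloud.com/ords/devuser/busconfind/"

# The first three pages with the default limit of 25:
pages = [page_url(base, offset) for offset in (0, 25, 50)]
for p in pages:
    print(p)
```

In practice you would keep requesting the next offset while the response's hasMore field is true.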
Getting the cURL command from an already ORDS REST-enabled tableSelecting the GET request for BashGET response in JSONThe raw JSON, pretty printed
The code
And the code for this test2.py looks like this:
# Auto-REST enabled with ORDS; in an Oracle Autonomous Database with query parameters.

import requests
import pprint


# Importing the base URI from this python file.
from testurls import test2_url

# An unprotected endpoint that has been "switched on" with the ORDS Auto-REST enable feature.
# Query parameters can be added/passed to the Base URI for GET-ing more discrete information.

url = (test2_url + '?q={"location":"ZAF","value":{"$gt":100},"$orderby":{"value":"asc"}}')

# For prototyping an application, in its earlier stages, this could really work. On your front end, you
# expect the user to make certain selections, and you'll still pass those as parameters.
# But here, you do this as a query string. In later stages, you may want to streamline your application
# code by placing all this into a PL/SQL or SQL statement. Thereby separating application
# logic and business logic. You'll see this approach in the test3.py file.
# This works, but you can see how it gets verbose, quick. It's a great jumping-off point.

responsefromadb = requests.get(url)
pprint.pprint(responsefromadb.json())
Lines 8 and 13 are the two areas to focus on in this example. In Line 8, I imported my URL from the testurls.py file (again, abstracting it so it’s not in the main body of the code).
The test2.py and testurls.py files
And then, in Line 13, I appended a query string to the end of that URL. ORDS expects the query parameters to be a JSON object with the following syntax:
[ORDS Endpoint]/?q={"JSON Key": "JSON Value"}
The new, complete query string below requests the same information as was requested in the test1.py example:
This string begins with that same Base URI for the ORDS endpoint (the auto-REST enabled BUSCONFIND table) and then applies the query string prefix “?q=”, followed by these parameters:
Filter by the location "ZAF"
Limit the search of these locations to values (in the Value column) greater than ($gt) 100
Return these results in ascending order (asc) of the Value column
NOTE: You can manipulate the offsets and limits in the python-oracledb driver too. More info found here. And filtering in queries with ORDS can be found here.
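One way to avoid hand-writing that query string (and its easy-to-misplace braces) is to build the filter as a Python dict and URL-encode it. A sketch, using a placeholder base URL:

```python
import json
import urllib.parse

# The same ORDS filter expressed as a Python dict instead of a hand-typed string:
# location ZAF, value greater than 100, results in ascending order of value.
query_filter = {
    "location": "ZAF",
    "value": {"$gt": 100},
    "$orderby": {"value": "asc"},
}

# Placeholder base URI standing in for the auto-REST enabled BUSCONFIND endpoint.
base_url = "https://example.adb.oraclecloud.com/ords/devuser/busconfind/"
url = base_url + "?q=" + urllib.parse.quote(json.dumps(query_filter))
print(url)
```

Letting json.dumps serialize the filter guarantees balanced braces and correct quoting, which is easy to get wrong by hand.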
And if I run the test2.py code in the VS Code Interactive Window, I’ll see the following summary output.
Summary output from the response in test2.py
Here is a more detailed view in the VS Code text editor:
Detailed output with helpful links
Wrap-up
A slightly different approach, right? The data is all there, similar to what you saw in the test1.py example. There are a few things to note, though:
The consumer of this ORDS REST API doesn’t need access to the database (i.e. you don’t need to be an admin or have a schema); you can perform GET requests on this URI.
The response body is in JSON (ubiquitous across the web and web applications)
Also, language and framework agnostic (the JSON can be consumed/used widely, and not just with Python)
You are provided a URI for each item (i.e. entry, row, etc.)
No need for SQL; just filter with the JSON query parameters
No business logic in the application code
Needless to say, no ORMs or database modeling is required for this approach
However…security is, ahem…nonexistent. That is a problem and flies in the face of what we recommend in our ORDS Best Practices.
Connecting via ORDS: secured with OAuth2
Note: This is an abbreviated explanation, I'll be posting an expanded write-up on this example post haste!
Since this is what I’m considering “advanced” (it’s not difficult, there are just many pieces), I’m going to keep this section brief. Long story short, I’ll take those query parameters from above and place them into what is referred to as a Resource Handler.
TIME-OUT: Auto-REST enabling a database object (the BUSCONFIND table in this case) is simple in Database Actions. It's a simple right-click > REST-enable. You saw that in the previous example. You are provided an endpoint, and you can use the query parameters (i.e. the JSON {key: value} pairs) to access whatever you need from that object.
However, creating a custom ORDS REST endpoint is a little different. First you create a Resource Module, next a (or many) Resource Template/s, and then a (or many) Resource Handler/s. In that Resource Handler, you'll find the related business logic code for that particular HTTP operation (the menu includes: GET, POST, PUT, and DELETE).
The Resource Module
The process of creating a custom ORDS API might be difficult to visualize, so I’ll include the steps I took along with a sample query (in that Resource Handler) to help illustrate.
Creating the Resource Module in the ORDS REST Workshop
Creating the Resource Template
Reviewing the available operations for the Resource Template
The newly created Resource GET Handler
Placing the SQL directly into the Resource Handler
Testing out the code to simulate a GET request using "ZAF" as the location
Reviewing the output of that SQL query, in a table format
Chances are you may be the administrator of your Always Free tenancy, so you have full control over this. Other times, you might simply be provided the REST endpoint. In that case, you may never have to worry about these steps. Either way, you can see how we’re simulating the query with this final example (test3.py), while abstracting the business logic away from the application and keeping it in the database.
Security
The OAuth 2.0 authorization framework enables a third-party application to obtain limited access to an HTTP service, either on behalf of a resource owner by orchestrating an approval interaction between the resource owner and the HTTP service, or by allowing the third-party application to obtain access on its own behalf.
RFC 6749: The OAuth 2.0 Authorization Framework
I’ll keep this section brief, but I’m protecting this resource through the aid of an ORDS OAuth2.0 client. I’ve created one here:
After creating a client you can use the provided URL for requesting a new Bearer Token
And, as you’ll see shortly, I’ll rely on some Python libraries for requesting an Authorization Token to use with the related Client ID and Client Secret. If you want to nerd out on the OAuth2.0 framework, I challenge you to read this.
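For a peek under the hood: with the client_credentials grant, the token request those libraries issue is just an HTTP POST with the Client ID and Client Secret in a Basic Authorization header. A rough sketch of what gets sent on your behalf; the token URL and credentials here are placeholders, not real values:

```python
import base64

# Placeholder values -- use the token URL, Client ID, and Client
# Secret from your own ORDS OAuth client.
token_url = "https://example-adb.oraclecloudapps.com/ords/admin/oauth/token"
client_id = "my-client-id"
client_secret = "my-client-secret"

# The credentials travel in an HTTP Basic Authorization header;
# the grant type goes in the form-encoded request body.
credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
headers = {
    "Authorization": f"Basic {credentials}",
    "Content-Type": "application/x-www-form-urlencoded",
}
data = {"grant_type": "client_credentials"}

# The live call (requires a reachable ORDS instance) would be:
#   response = requests.post(token_url, headers=headers, data=data)
#   bearer_token = response.json()["access_token"]
print(headers["Authorization"])
```

The requests_oauthlib libraries do exactly this for you, which is why the upcoming test3.py stays so short.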
test3.py example
NOTE: Remember, I'm keeping this section intentionally brief. It deserves a slightly deeper dive, and class is almost over (so I'm running out of time).
The code for this example:
# Custom ORDS Module in an Oracle Autonomous Database.
import requests
from requests_oauthlib import OAuth2Session
from oauthlib.oauth2 import BackendApplicationClient
import pprint
import json
# Importing the base URI from this python file.
from testurls import test3_url
# A separate python file I created and later import here. It contains my credentials,
# so as not to show them in this script here.
from oauth2creds import token_url, client_id, client_secret
client = BackendApplicationClient(client_id=client_id)
oauth = OAuth2Session(client=client)
token = oauth.fetch_token(token_url, client_id=client_id, client_secret=client_secret)
bearer_token = token['access_token']
# Location can be anything from the table. Now, only the single variable needs to be passed; the business logic has been abstracted somewhat, as it now resides within
# ORDS. This could make your application more portable (to other languages and frameworks, since there are fewer idiosyncrasies and dependencies):
location = "ZAF"
# print(location)
# ------------------------------------------------------------------------------ #
# In Database Actions, we:
# 1. Create an API Module
# 2. Then create a Resource Template
# 3. Finally, a GET Resource Handler that consists of the code from test1.py:
# select * from BUSCONFIND where location= :id
# order by value ASC
# ------------------------------------------------------------------------------ #
url = (test3_url + location)
# print(url)
responsefromadb = requests.get(url, headers={'Authorization': 'Bearer ' + bearer_token}).json()
# This step isn't necessary; it simply prints out the JSON response object in a more readable format.
pprint.pprint(responsefromadb)
Lines 11 and 16 deserve some attention here. The URL on Line 11 comes from the testurls.py file, seen in the previous example. And the contents of Line 16 come from the oauth2creds.py file. Here are the files, side-by-side:
The test3.py, testurls.py, and oauth2creds.py files
As you can see in the testurls.py file, I’m relying on the test3_url for this example. And the OAuth2.0 information you see comes directly from the OAuth Client I created in Database Actions:
In this image, you can see the Client ID and Client Secret
If I put that all together, I can execute the code in test3.py and “pretty print” the response in my Interactive Window. But first I need to adjust the Resource Handler’s URI (the one I copied and pasted from the “REST Workshop”). It retains the “:id” bind parameter, but the way I have this Python code set up, I need to remove it, so the URI goes from ending in “:id” to ending without it.
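In other words, the adjustment looks something like this (the host and path here are made up; use the URI copied from your own REST Workshop):

```python
# The template URI as copied from the REST Workshop (placeholder host):
template_uri = "https://example-adb.oraclecloudapps.com/ords/admin/api/busconfind/:id"

# Dropping the ":id" bind parameter leaves a base URI the script can
# simply append the location value to:
test3_url = template_uri.rsplit(":id", 1)[0]

location = "ZAF"
url = test3_url + location
print(url)
```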
With that out of the way, I can run this code and review the output.
Running the test3.py code in the Interactive Window
Reviewing the summary output – a JSON array
Reviewing the detailed view of the “items“
Scrolling to the bottom of the GET response body to see the available links for additional items
From top to bottom, left to right, you’ll see I first execute the code in the Interactive Window. From there, I can review a summary of the response to my GET request. The pretty print library lets us see the JSON array in a more readable format (one with indentation and nesting), which you can see in the second image. The third image is a more detailed view of the first half of this response. And I include the final image to highlight the helpful URLs included in the response body.
Since my limit = 25 and the response includes 'hasMore': True (seen in the output in that third image), I know there are more items. You can adjust the limit and offset in subsequent requests, but I’ll save that for another day.
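A quick sketch of what that paging would look like, using a trimmed-down stand-in for the response body and a placeholder endpoint (substitute your own Resource Handler URI):

```python
# A trimmed-down stand-in for the JSON body ORDS returns.
response_body = {
    "items": ["...25 rows elided..."],
    "hasMore": True,
    "limit": 25,
    "offset": 0,
}

# Placeholder endpoint -- substitute your own Resource Handler's URI.
url = "https://example-adb.oraclecloudapps.com/ords/admin/api/busconfind/ZAF"

# When hasMore is True, fetch the next page by advancing the offset
# by the page size (the limit).
if response_body["hasMore"]:
    next_offset = response_body["offset"] + response_body["limit"]
    next_url = f"{url}?offset={next_offset}"
    print(next_url)  # ...?offset=25
```

You could also follow the rel="next" entry in the response's links array rather than computing the offset yourself.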
Wrap-up
You can probably tell, but this is an expansion of the previous example. Instead of relying on auto-REST enabling, you are in full control of the Resource Module. And while you don’t have to use OAuth2.0, it’s good practice to protect your endpoints with it. You can see how the response comes through a little differently compared to the previous example, but it is still very similar.
In this example, I did all the work, but that might not be the case for you; much of it might be handled for you. The main thing I like about this example is that we rely on stable and popular Python libraries: requests, requests_oauthlib, and oauthlib.
The fact that this is delivered as a JSON object is helpful as well (for the same reasons mentioned in the second example). And finally, I enjoy the fact that you only need to pass a single parameter from your (assumed) presentation layer to your application layer; an example might be a selection from an HTML form or drop-down menu item.
The end
We’re at the end of this fun little exercise. As I mentioned before, I will expand on this third example. There are so many steps, and I think it would be helpful for people to see a more detailed walk-through.
And be on the lookout (BOLO) for a video; there’s no way around it, this post needs an accompanying video.
And finally, you can find all the code I review in this post in my new “blogs” repository on GitHub. I encourage you to clone, fork, spoon, ladle, knife, etc…