I’m kidding, there’s more. But if you don’t feel like reading anymore, bookmark this post and return when you’re ready.
Assuming you have ORDS installed, you too can execute the ords config list --include-defaults command to reveal almost all the configuration settings for your ORDS installation.
Here is what my configuration looks like:
Executing the ords config list --include-defaults command.
Configuration settings: what am I seeing?
This command is a quick way to see all the settings from your XML configuration files (i.e., the settings.xml and pool.xml files), including other settings automatically configured for you when you first ran the ORDS interactive installer.
In short: All your default settings and any that you may have added or changed are on one screen.
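Here's a minimal, illustrative sketch of the command and the kind of key/value pairs it returns (abridged; the property names below are ones I call out later in this post, and your values and formatting will differ):

# List every configuration setting, including the defaults you never touched:
ords config list --include-defaults

# Abridged, illustrative output:
#   database.api.management.services.disabled   false
#   debug.printDebugToScreen                    false
#   jdbc.MaxLimit                               20
#   standalone.static.context.path              /i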
Read on to explore further…
Version, config folder location, and pool information
I've broken the configuration settings into sections. Anywhere you see red arrows are just areas of personal intrigue; however, THIS LIST IS NOT EXHAUSTIVE.
I use this first section as an easy, convenient way to determine the ORDS version I'm running. Additionally, you can verify the location of your configuration folder (in case you forget) and the database pool you are using (default is the pool's name unless you modify it).
ORDS version, configuration folder location, and database pool information.
If you want to learn more about the ORDS pools, visit this link and this link.
Tip: I only have a single install, which is why I see default as the default database pool.
Pool and global settings
Not much here that you probably don't already know. However, in the future I will look at the features associated with the database.api.management.services.disabled = false property (also, I think the way this is written is referred to as a "logical negation," and it hurts my brain to read).
General ORDS settings.
Read more about this service here. But in short (and once you've created the requisite user), you can explore various services, such as the following (a hedged request sketch follows the list):
DBCA Jobs: DELETE, GET, and POST
DBCA Templates: GET
Oracle Home Environment: GET
PDB Lifecycle: DELETE, GET, and POST
Open Service Broker: DELETE, GET, and PUT
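And here's a hedged sketch of what calling one of these services can look like, using the PDB Lifecycle list endpoint (the hostname, port, and credentials are placeholders; the /_/db-api/stable/ base path is the documented Database API pattern):

# List the PDBs in your database (assumes you've created the requisite Database API user):
curl --user DBAPI_USER:password \
  "http://localhost:8080/ords/_/db-api/stable/database/pdbs/"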
Debug and Error
My settings are false (these are the default settings). But if I were to, for instance, set debug.printDebugToScreen = true, I would then be able to see any error messages in the browser.
ORDS debug and error settings.
I can change the responseFormat to always display as JSON, HTML, or AUTO (i.e., automatically determine the most appropriate format).
Note: Must explore this further and report back after I've sufficiently tinkered.
Did you know you can create custom HTTP error pages in ORDS? These two error properties appear to be associated in some way, so if you were to create custom error pages, you'd probably need to consider the format as well. Nonetheless, could you imagine the fun you could have coming up with something totally unique to your application?
I'm definitely adding this to my "Productive Procrastination" list!
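If you'd like to tinker too, here's a minimal sketch of flipping these settings from the command line. It assumes the ords config set syntax, and that error.responseFormat is my guess at the key behind the responseFormat setting mentioned above; run it against your own configuration directory and restart ORDS afterward:

# Print error messages to the browser (handy while developing, not in production):
ords config set debug.printDebugToScreen true

# Always render error responses as JSON (html and auto being the other options):
ords config set error.responseFormat json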
GraphQL and SQL Developer Web
ORDS supports GraphQL now; did you know?! I just set up my local installation (it wasn't too bad once I figured out how to properly set my Java to GraalVM), so I can start learning GraphQL queries.
GraphQL and ORDS settings.
Did you know ORDS ships with the GraphiQL IDE now? Learn how to set it up here.
Cookies and ICAP
I honestly wouldn't have known ORDS could offload virus scanning to ICAP (Internet Content Adaptation Protocol) servers unless I looked at what was actually in the configuration settings. I'm not sure if I'll configure this anytime soon, but maybe you will.
Want to bore yourself? Read more about ICAP in this RFC 3507 memo.
Java Database Connectivity (JDBC)
I am NOT going to spend much time here. I still need to toggle these parameters and experiment more. However, I will point out that the default setting for maximum JDBC connections is 20 (the jdbc.MaxLimit setting).
Java Database Connectivity and ORDS settings.
jdbc.MaxLimit=20 is probably too low for a production environment. I’ve left it as-is because it’s just me, and I’m doing everything locally in my Podman container.
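If you did need to raise it, here's a minimal sketch, assuming jdbc.MaxLimit is a pool-level setting and that your pool is still named default:

# Raise the maximum number of pooled JDBC connections for the default pool:
ords config --db-pool default set jdbc.MaxLimit 50

# Confirm the new value:
ords config --db-pool default get jdbc.MaxLimit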
Suppose you need to familiarize yourself with JDBC or Universal Connection Pools (UCPs)? In that case, we should both read the introduction sections of the following guides:
I have spent little time with MongoDB, but from what I understand, the Oracle Database API for MongoDB translates the MongoDB wire protocol into SQL statements executed by the Oracle database.
ORDS and the MongoDB API settings.
What I'm inferring from our docs is that once you've migrated your data from MongoDB into a supported Oracle database, you (or your application) can keep talking "MongoDB speak," and at least in this case, ORDS will be able to interpret this Mongospeak and query the database on your behalf!
If this describes you or your use-case, you’re in luck; I found some excellent resources!
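If you're curious, here's a hedged sketch of switching the API on (mongo.enabled and mongo.port are my assumptions for the property names behind the MongoDB API settings shown above; restart ORDS after changing them):

# Enable the Oracle Database API for MongoDB:
ords config set mongo.enabled true

# The port the MongoDB wire listener uses (27017 is the conventional MongoDB port):
ords config set mongo.port 27017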
You'll notice no red arrows here; I have yet to spend much time with this section. However, I want to draw your attention to the security.jwks.[etc...] and security.jwt.[etc...] properties.
ORDS security settings.
In ORDS 23.3, we introduced support for JSON Web Tokens (JWTs), so these properties very much concern that new functionality.
In short, we've allowed you to incorporate JWT authentication provided through third parties into your APIs.
The nice thing about ORDS is that you can use the embedded Jetty server as a local web server for testing. This section shows most of the essential settings for running Jetty in “Standalone mode.”
ORDS standalone Jetty Server settings.
I use the term "testing" because our docs state, "the default configuration of Jetty is optimized for the most common ORDS use cases." I interpret this as, "This is designed to expose you to Jetty (and make it easy to get you up and running), but you'll probably need to adjust it according to your own requirements."
The only things I want to point out here are the standalone.doc.root and standalone.static.context.path properties. These settings will look familiar if you’ve ever performed an APEX installation (available here, for free, BTW).
However, if you want to deploy custom HTML, CSS, and image files, you can configure this for ORDS. We have an overview in our docs here.
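A minimal sketch of what that configuration might look like (the folder path is a placeholder; the two properties are the ones named above):

# Point the doc root at your own HTML, CSS, and image files:
ords config set standalone.doc.root /path/to/my/static/files

# The context path for static resources (APEX installations typically use /i):
ords config set standalone.static.context.path /i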
I think I've just stumbled upon another fun Friday afternoon project!
Okay, that’s it for now. Thank you for choosing to waste your time with me.
What’s the point of this post?
There was no point to this post. I’m constantly wasting time researching technology and techniques I don’t need to know. However, in this case, I’ve hopefully:
left you with at least one helpful ORDS command-line command (ords config list --include-defaults), and
provided you with some helpful explanations and resources on what is contained in your ORDS installation (again, this list is NOT exhaustive).
And if you found this post helpful, please share it!
I found JavaScript and HTML code here and here and “remixed” it to work with one of my sample ORDS APIs. Here is the result:
ORDS + JavaScript + Fetch API + HTML
Impressive, no? Care to try it out? Read on, friend!
References
I'll front-load all the necessary stuff. That way, you can bounce if you don't feel like reading. You'll get the gist if you follow along with what I've provided.
Much of what I learned came from the MDN Web Docs site. I would get acquainted with the following pieces of code (or at least have them handy) since they heavily influenced me (a.k.a. plagiarized).
MDN Web Docs
I either used or referenced these files in my version of the code. They are all available in the two links I mentioned above, but I’m adding them here for convenience (in case you need to leave or want to review while on this page).
Click these to reveal the contents
No! Not this one, dummy. This one is just an example…duh!
Response: json() method
const myList = document.querySelector("ul");
const myRequest = new Request("products.json");

fetch(myRequest)
  .then((response) => response.json())
  .then((data) => {
    for (const product of data.products) {
      const listItem = document.createElement("li");
      listItem.appendChild(document.createElement("strong")).textContent =
        product.Name;
      listItem.append(` can be found in ${product.Location}. Cost: `);
      listItem.appendChild(document.createElement("strong")).textContent =
        `£${product.Price}`;
      myList.appendChild(listItem);
    }
  })
  .catch(console.error);
Check out my remixed JavaScript code, DDL, and ORDS module definitions (for this example) on my GitHub blog repo. I'll also include select items below.
Here are a few things to point out:
In line 16 of my index.html code, I referenced the JavaScript code (script.js) separately. This approach achieves the same effect as embedding the JavaScript directly into the HTML file (as seen in MDN's version of the index.html file).
The script.js contains the Fetch API and the JavaScript concept of "promises." The following were super helpful for me. Maybe they will be for you too:
The JSON file contains an example of what an ORDS GET request response looks like (if viewing in the browser). The structure is nearly identical if you compare it to the MDN JSON file.
This means you can take their HTML and JavaScript code and populate it with an ORDS endpoint and [subsequent] response data (i.e., the stuff you see in this localhost.json file).
const ordsApi = "http://localhost:8080/ords/ordstest/api/example/api/";
// This next one is just an example using query parameters. I just chose a random employee number:
// const filteredOrdsApi = 'http://localhost:8080/ords/ordstest/api/example/api/?q={"empno":"7876"}';

const myList = document.querySelector("ul");

fetch(ordsApi)
  .then((response) => {
    if (!response.ok) {
      throw new Error(`HTTP error, status = ${response.status}`);
    }
    return response.json();
  })
  .then((data) => {
    for (const item of data.items) {
      const listItem = document.createElement("p");
      const empnoElement = document.createElement("strong");
      empnoElement.textContent = item.empno;
      const enameElement = document.createElement("strong");
      enameElement.textContent = `${item.ename}`;
      const dnameElement = document.createElement("strong");
      dnameElement.textContent = `${item.dname}`;
      const jobElement = document.createElement("strong");
      jobElement.textContent = `${item.job}`;
      listItem.append(
        `Employee number `, empnoElement, ` was hired on ${item.hiredate}.`,
        ` Their last name is `, enameElement, ` and they work as a `, jobElement,
        ` in the `, dnameElement, ` department.`
      );
      myList.appendChild(listItem);
    }
  })
  .catch((error) => {
    const p = document.createElement("p");
    p.appendChild(document.createTextNode(`Error: ${error.message}`));
    document.body.insertBefore(p, myList);
  });
I’m also using the Live Server extension for VS Code. If you don’t have it, you’ll need it to run the code I’ve provided. You can download it from the VS Code Marketplace here.
You’ll want Live Server for this one!
How I met your Mothra
Where to start? From the beginning, right? What you see below are two JSON files. On the left, from ORDS. On the right, from the MDN Web Docs sample code (direct link to that file).
Comparing JSON
ORDS on the left, MDN on the right.
They are nearly identical. They are both a JSON object {} comprised of key: value pairs, where the first key's value is an array []. In both files, this array holds more objects {}, and each of those objects has its own key: value pairs…marone!
I mention all this because this makes the existing code easy to work with. Which you’ll see shortly.
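To make that concrete, here's an illustrative, hand-trimmed sketch of the ORDS shape, with jq plucking the first object out of the items array (the employee fields come from my sample data; hasMore, limit, offset, and count are the pagination keys ORDS adds):

# Pipe a miniature ORDS-style response through jq:
cat <<'EOF' | jq '.items[0]'
{"items":[{"empno":7876,"ename":"ADAMS","job":"CLERK","dname":"RESEARCH"}],"hasMore":false,"limit":25,"offset":0,"count":1}
EOF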
Comparing JavaScript
Next is the JavaScript code; I’ll compare both my version and the MDN Web Docs version.
ORDS on the left; can you spot the differences?
You'll notice that a lot of the code is quite similar. I kept it this way so I wouldn't unintentionally break anything. The main differences in my code are the:
const ordsApi on line 1 (as opposed to referencing a JSON file).
Naming conventions in lines 14-27.
listItem.append(); on line 29 is heavily remixed (I did this so I could create individual lines for each entry).
Templating in my code (i.e., wherever you see the little backtick ` marks; template literals allow you to embed values directly into the text). I use A LOT more of it!
About the ORDS JSON Object
If you were to navigate to your ORDS endpoint, it would look like the images below. I’m including them for a couple of reasons:
You can see those key: value pairs in a different presentation.
These images help connect what is coming through in that GET request and what you see in the JavaScript code.
The items key with its value (an array). Remember the other key: value pairs, too!
Reviewing the HTML
Assuming you’ve started up Live Server (along with setting up your environment to mimic my own), you’ll immediately see this beauty of a web page. This image alone doesn’t tell a complete story, though.
Review line 29 in the JavaScript code; it’ll help to “connect the dots.”
However, when you open up the developer tools in your browser, you’ll see what is happening under the covers.
Live Server starts up, sees the index.html file, and “serves” it up.
In that HTML file is a reference to script.js; the JavaScript is run.
The JavaScript composes a list and then appends all the data you see here (on screen):
With developer tools open, you can see the HTML. This HTML should look similar to lines 12-27 of the JavaScript code.
Summary
After writing this up, I’m realizing this clearly needs to be a video. But if you get it, great! Otherwise, stay tuned!
There isn’t anything ground-breaking here. I’m highlighting an example of manipulating existing ORDS JSON objects (with the Fetch API) because I hadn’t seen anything quite like what I am presenting here.
Also, the web page that I’m showing is very, very basic. I’m neither a UX nor UI designer, so this is what you get, folks!
The main point is that the ORDS APIs are effortless to work with if you have a fundamental understanding of manipulating JSON objects using JavaScript. They are no different than what you see out in the wild.
Some follow-up
I want to take this and add some React to it. And I'd also like to add authentication (Basic, OAuth 2.0, and JSON Web Tokens). But baby steps.
Okay, that's all for now, folks. Sayonara!
A while back (yesterday), I penned a blog post highlighting the ORDS REST-Enabled SQL Service. And in that blog, I displayed the output of a cURL command; a cURL command I issued to an ORDS REST-Enabled SQL Service endpoint. Unfortunately, it was very messy and very unreadable. I mentioned that I would fix it later. Well…it's now…later (temporal paradox, anybody?).
Recap
If you recall, the output of my POST request looked like this:
Yikes, you kiss your mother with that mouth?!
JSON is not displaying correctly
Well, the reason I didn't originally pipe in the json_pp command is that this is what happened when I attempted it:
Jefe to the rescue
After reading my newly published article, Jefe suggested I try the jq command.
The Yoda to my Padawan
Which, of course, I did. Still no luck:
Different issue though
Andiamo a googliare! (Let's go googling!)
Online search to the rescue
Search online using the keywords “parse error: Invalid numeric literal at,” and you’ll quickly discover that you’re not the only one with this problem.
Five minutes of research revealed a potential culprit. What I was experiencing seemed to be a known issue. For example, a long-standing jq bug on GitHub details this exact scenario. This doesn't seem to be a jq or json_pp issue. Instead, the problem is somehow related to the -i cURL command option and JSON parsing.
After another few minutes, as luck would have it, I found a Stack Overflow thread discussing the same issue I encountered! After scrolling to the bottom of the thread, I found this golden nugget:
Thank you, Mattias and nhs503!
Testing without -i
So, I did just what Mattias and nhs503 suggested. I removed the -i option (-i, or --include) from my cURL command, and wouldn't you know? The damn thing works as expected! I tested while piping to both jq and json_pp. I also concede that jq is the prettier of the two; I appreciate the colors (although, admittedly, this would NOT pass any accessibility testing).
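For reference, here's a hedged reconstruction of the working command (the user, password, port, and file name are placeholders patterned after my local setup; /_/sql is the REST-Enabled SQL endpoint pattern):

# No -i this time, so only the JSON body reaches jq:
curl -s -X POST --user ORDSTEST:password \
  -H "Content-Type: application/sql" \
  --data-binary @sportCountryMatrix.sql \
  http://localhost:8080/ords/ordstest/_/sql | jq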
jq part one of the response
jq part two of the response
json_pp part one of the response
json_pp part two of the response
Final thoughts
And for some final thoughts…
It turns out it's NOT ORDS; it's something to do with an underlying JSON parser not liking the header info that comes through.
json_pp and jq both work; they output the information in a different order.
The ORDS REST-Enabled SQL Service returns not only your results but also the SQL statement originally used (that's cool; I didn't originally realize or mention this).
And that’s it for this one! I really hope you find this useful. I hope this saves you some time from having to troubleshoot and/or hunt for a fix for this tricky problem. That’s all for now!
I promise this post will connect back to an overarching theme. But for now, I want to show how you can take a SQL query and use that in combination with the ORDS REST-Enabled SQL Service to request data from a database table.
The SQL query
Here is the SQL query I’m using:
select * from (
select noc, sport
from olympic_medal_winners
)
pivot (min('X') for sport in (
'Archery' as arc, 'Athletics' as ath, 'Hockey' as hoc,
'Judo' as jud, 'Sailing' as sai, 'Wrestling' as wre
)
)
order by noc
fetch first 7 rows only
The SQL Script
Please feel free to cheat like me and steal this same script from the Live SQL site (direct link here). And if you can't be bothered to do that, the script, in its entirety, can be found at the bottom of the post.
PAUSE: Shout out to Chris Saxon for conceiving this. And putting in the real work. I'm both too lazy and too dumb to come up with this on my own.
The demo
Let's assume you've created the table and inserted all the same data. Now, you can take a SQL query (use the same one as me, or don't, I don't care) and run it in an SQL Worksheet (like I did here):
Coooool dude...you did a thing I already know how to do!
That's what you'll end up with. It's actually a pretty neat printout; I didn't even know you could do this! But I want to take that SQL query and demonstrate how you can do this with the ORDS REST-Enabled SQL Service.
Disclaimer
I’m performing this demo locally. I have a Podman container running with an Oracle database therein (one I grabbed from our Oracle Container Registry). I’ve also installed ORDS in this database and used my ORDSTEST user (the same setup as in my original Podman/ORDS how-to post).
About the REST-Enabled SQL Service
How do I set this service up? When you first install ORDS, if you enable Database Actions, you are also enabling this REST-Enabled SQL Service. You’ll see it in this step in the ORDS Interactive Installer:
Enter a number to select additional feature(s) to enable:
[1] Database Actions (Enables all features)
[2] REST Enabled SQL and Database API
[3] REST Enabled SQL
[4] Database API
[5] None
Choose [1]:
See it? If you select [1], then you are also enabling features [2], [3], and [4].
PRO TIP: Basically, if you can sign into Database Actions, then you're good. You're all set.
Long story short: you are taking that wacky SQL from the above example (something that would take me a week to come up with on my own) and passing it as a payload in a POST request to your REST-Enabled SQL Service endpoint.
Since I am doing this locally, my REST-Enabled SQL Service endpoint looks like this:
You should see something similar; your REST-enabled schema alias will differ, however.
Next, with my Terminal open (and ORDS running, duh!) I’ll issue the following command:
Ah-ha! You probably noticed that the --data-binary option references a separate sportCountryMatrix.sql file. That's because our docs recommend using an SQL file for multi-line SQL statements (like my example). I bet you could pass in this multi-line statement via the command line, but that seems unnecessarily challenging. Plus, I KNOW this works.
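Since the command itself lives in a screenshot, here's a hedged reconstruction of it (the user, password, and port are placeholders from my local setup; /_/sql is the documented REST-Enabled SQL endpoint pattern):

# POST the SQL file to the REST-Enabled SQL endpoint; -i also returns the response headers:
curl -i -X POST --user ORDSTEST:password \
  -H "Content-Type: application/sql" \
  --data-binary @sportCountryMatrix.sql \
  http://localhost:8080/ords/ordstest/_/sql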
A quick review of the directory setup
I want to quickly review how I created this file, mostly remaining in Terminal. I first created a new ordsSqlService directory on my desktop. Then, I made an empty sportCountryMatrix.sql file.
Using Vim, I opened that file, pasted my choice SQL statement, saved it, and exited. I wanted to mention this because when I executed that cURL command, it worked because I was in the same directory as the SQL file!
Here are some screenshots of me going through those steps. You’ll see me creating the file but then also using the cat command so that I can double-check the contents of the .sql file.
And now, back to the cURL command. After issuing the command, here is the response to the POST request:
I know the response isn't the most readable, but I can figure that out another time (I have something else planned as a follow-up to this post). But it's all there, trust me (I'm a doctor)!
Also, this blog post was about 50% me messing around and 50% reminding YOU that ORDS is capable of this (right out of the box, with the correct switches turned on). So, hopefully, you get the gist.
Takeaways
Let me close this out with some final thoughts…
If you can sign into Database Actions, then you can take advantage of the REST-Enabled SQL Service
I haven’t explored how to pretty print the JSON response so it is more readable (and yes, I tried piping in | json_pp; it didn’t work)
Update on #2…I actually did figure this out. Read about that here.
You have to use your database username and password; this isn’t ideal for two reasons:
Security
Resource consumption (Basic Authentication can become costly, quick)
You can take pretty much any SQL query and turn it into a “Resource”
For instance, if you are an analyst, you can take that SQL query, save it as a file, and pass it in your cURL command to get precisely what you want.
Follow
And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!
-- Script: Pivot and unpivot examples using Olympic data
-- Examples of pivoting and unpivoting data. Uses a subset of
-- the results from the Rio Olympics as a data source.
-- For further explanation of the scripts, read the following blog
-- post:
-- https://blogs.oracle.com/sql/entry/how_to_convert_rows_to
create table olympic_medal_winners (
olympic_year int,
sport varchar2( 30 ),
gender varchar2( 1 ),
event varchar2( 128 ),
medal varchar2( 10 ),
noc varchar2( 3 ),
athlete varchar2( 128 )
);
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Individual','Gold','KOR','KU Bonchan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Individual','Silver','FRA','VALLADONT Jean-Charles');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Individual','Bronze','USA','ELLISON Brady');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Team','Gold','KOR','Republic of Korea');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Team','Bronze','AUS','Australia');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Team','Silver','USA','United States');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Floor Exercise','Gold','GBR','WHITLOCK Max');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Floor Exercise','Bronze','BRA','MARIANO Arthur');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Floor Exercise','Silver','BRA','HYPOLITO Diego');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Horizontal Bar','Gold','GER','HAMBUECHEN Fabian');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Horizontal Bar','Bronze','GBR','WILSON Nile');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Horizontal Bar','Silver','USA','LEYVA Danell');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 10,000m','Gold','GBR','FARAH Mohamed');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 10,000m','Bronze','ETH','TOLA Tamirat');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 10,000m','Silver','KEN','TANUI Paul Kipngetich');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 100m','Gold','JAM','BOLT Usain');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 100m','Silver','USA','GATLIN Justin');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 100m','Bronze','CAN','DE GRASSE Andre');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Gold','CHN','Zhang');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Bronze','GBR','Langridge');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Bronze','GBR','Ellis');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Silver','MAS','Tan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Silver','MAS','Goh');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Gold','CHN','Fu');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Gold','BRA','Cerutti');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Gold','BRA','Oscar Schmidt');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Silver','ITA','Nicolai');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Silver','ITA','Lupo');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Bronze','NED','Meeuwsen');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Bronze','NED','Brouwer');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Gold','CUB','RAMIREZ Robeisy');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Bronze','UZB','AKHMADALIEV Murodjon');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Bronze','RUS','NIKITIN Vladimir');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Silver','USA','STEVENSON Shakur');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Fly (52kg)','Gold','UZB','ZOIROV Shakhobidin');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Fly (52kg)','Bronze','CHN','HU Jianguan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Gold','SVK','PETER Skantar');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Bronze','FRA','GAUTHIER Klauss');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Bronze','FRA','MATTHIEU Peche');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Silver','GBR','RICHARD Hounslow');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Silver','GBR','DAVID Florence');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Gold','SVK','LADISLAV Skantar');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Gold','GER','Brendel');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Bronze','UKR','Mishchuk');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Bronze','UKR','Ianchuk');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Silver','BRA','Queiroz dos Santos');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Silver','BRA','de Souza Silva');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Gold','GER','Vandrey');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Individual Time Trial','Gold','SUI','CANCELLARA Fabian');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Individual Time Trial','Bronze','GBR','FROOME Christopher');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Individual Time Trial','Silver','NED','DUMOULIN Tom');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Road Race','Gold','BEL','VAN AVERMAET Greg');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Road Race','Silver','DEN','FUGLSANG Jakob');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Road Race','Bronze','POL','MAJKA Rafal');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Keirin','Gold','GBR','KENNY Jason');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Keirin','Bronze','MAS','AWANG Azizulhasni');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Keirin','Silver','NED','BUCHLI Matthijs');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Omnium','Gold','ITA','VIVIANI Elia');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Omnium','Bronze','DEN','HANSEN Lasse Norman');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Omnium','Silver','GBR','CAVENDISH Mark');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 10m Platform','Gold','CHN','CHEN Aisen');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 10m Platform','Bronze','USA','BOUDIA David');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 10m Platform','Silver','MEX','SANCHEZ German');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 3m Springboard','Gold','CHN','CAO Yuan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 3m Springboard','Silver','GBR','LAUGHER Jack');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 3m Springboard','Bronze','GER','HAUSDING Patrick');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Individual','Gold','GBR','DUJARDIN Charlotte');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Individual','Bronze','GER','BRORING-SPREHE Kristina');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Individual','Silver','GER','WERTH Isabell');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Team','Gold','GER','Germany');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Team','Bronze','USA','United States');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Team','Silver','GBR','Great Britain');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Individual','Gold','ITA','GAROZZO Daniele');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Individual','Silver','USA','MASSIALAS Alexander');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Individual','Bronze','RUS','SAFIN Timur');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Team','Gold','RUS','Russian Federation');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Team','Bronze','USA','United States');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Team','Silver','FRA','France');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','M','Men','Gold','DEN','Denmark');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','M','Men','Silver','FRA','France');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','M','Men','Bronze','GER','Germany');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','W','Women','Gold','RUS','Russian Federation');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','W','Women','Silver','FRA','France');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','W','Women','Bronze','NOR','Norway');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','M','Men','Gold','ARG','Argentina');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','M','Men','Silver','BEL','Belgium');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','M','Men','Bronze','GER','Germany');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','W','Women','Gold','GBR','Great Britain');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','W','Women','Silver','NED','Netherlands');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','W','Women','Bronze','GER','Germany');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Gold','FRA','RINER Teddy');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Bronze','BRA','SILVA Rafael');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Bronze','ISR','SASSON Or');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Silver','JPN','HARASAWA Hisayoshi');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men -100 kg','Gold','CZE','KRPALEK Lukas');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men -100 kg','Bronze','FRA','MARET Cyrille');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','M','Men''s Individual','Gold','RUS','LESUN Alexander');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','M','Men''s Individual','Silver','UKR','TYMOSHCHENKO Pavlo');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','M','Men''s Individual','Bronze','MEX','HERNANDEZ USCANGA Ismael Marcelo');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','W','Women''s Individual','Gold','AUS','ESPOSITO Chloe');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','W','Women''s Individual','Silver','FRA','CLOUVEL Elodie');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','W','Women''s Individual','Bronze','POL','NOWACKA Oktawia');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Group All-Around','Gold','RUS','Russian Federation');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Group All-Around','Bronze','BUL','Bulgaria');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Group All-Around','Silver','ESP','Spain');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Individual All-Around','Gold','RUS','MAMUN Margarita');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Individual All-Around','Silver','RUS','KUDRYAVTSEVA Yana');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Individual All-Around','Bronze','UKR','RIZATDINOVA Ganna');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Gold','FRA','Azou');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Bronze','NOR','Brun');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Bronze','NOR','Strandli');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Silver','IRL','O''Donovan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Silver','IRL','O''Donovan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Gold','FRA','Houin');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Gold','CRO','Fantela');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Bronze','GRE','Kagialis');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Bronze','GRE','Mantis');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Silver','AUS','Ryan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Silver','AUS','Belcher');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Gold','CRO','Marenic');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Backstroke','Gold','USA','MURPHY Ryan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Backstroke','Bronze','USA','PLUMMER David');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Backstroke','Silver','CHN','XU Jiayu');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Breaststroke','Gold','GBR','PEATY Adam');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Breaststroke','Bronze','USA','MILLER Cody');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Breaststroke','Silver','RSA','VAN DER BURGH Cameron');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Singles','Gold','CHN','MA Long');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Singles','Bronze','JPN','MIZUTANI Jun');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Singles','Silver','CHN','ZHANG Jike');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Team','Gold','CHN','China');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Team','Bronze','GER','Germany');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Team','Silver','JPN','Japan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Gold','AZE','ISAEV Radik');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Bronze','KOR','CHA Dongmin');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Bronze','BRA','SIQUEIRA Maicon');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Silver','NIG','ISSOUFOU ALFAGA Abdoulrazak');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men -58kg','Gold','CHN','ZHAO Shuai');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men -58kg','Silver','THA','HANPRAB Tawin');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Gold','ESP','Lopez');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Bronze','USA','Johnson');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Bronze','USA','Sock');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Silver','ROU','Tecau');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Silver','ROU','Mergea');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Gold','ESP','Nadal');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','M','Men','Gold','BLR','HANCHAROU Uladzislau');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','M','Men','Silver','CHN','DONG Dong');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','M','Men','Bronze','CHN','GAO Lei');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','W','Women','Gold','CAN','MACLENNAN Rosannagh');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','W','Women','Silver','GBR','PAGE Bryony');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','W','Women','Bronze','CHN','LI Dan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','M','Men','Gold','GBR','BROWNLEE Alistair');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','M','Men','Silver','GBR','BROWNLEE Jonathan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','M','Men','Bronze','RSA','SCHOEMAN Henri');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','W','Women','Gold','USA','JORGENSEN Gwen');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','W','Women','Silver','SUI','SPIRIG HUG Nicola');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','W','Women','Bronze','GBR','HOLLAND Vicky');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','M','Men','Gold','BRA','Brazil');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','M','Men','Silver','ITA','Italy');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','M','Men','Bronze','USA','United States');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','W','Women','Gold','CHN','China');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','W','Women','Silver','SRB','Serbia');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','W','Women','Bronze','USA','United States');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','M','Men','Gold','SRB','Serbia');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','M','Men','Silver','CRO','Croatia');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','M','Men','Bronze','ITA','Italy');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','W','Women','Gold','USA','United States');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','W','Women','Silver','ITA','Italy');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','W','Women','Bronze','RUS','Russian Federation');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s +105kg','Gold','GEO','TALAKHADZE Lasha');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s +105kg','Bronze','GEO','TURMANIDZE Irakli');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s +105kg','Silver','ARM','MINASYAN Gor');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s 105kg','Gold','UZB','NURUDINOV Ruslan');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s 105kg','Bronze','KAZ','ZAICHIKOV Alexandr');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s 105kg','Silver','ARM','MARTIROSYAN Simon');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Gold','TUR','AKGUL Taha');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Bronze','BLR','SAIDAU Ibrahim');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Bronze','GEO','PETRIASHVILI Geno');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Silver','IRI','GHASEMI Komeil Nemat');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 57 kg','Gold','GEO','KHINCHEGASHVILI Vladimer');
Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 57 kg','Bronze','AZE','ALIYEV Haji');
-- This pivots the results by medal. But the columns not listed
-- in the pivot form an implicit group by. So this gives the
-- medal total per athlete per event.
select * from olympic_medal_winners
pivot ( count(*) for medal in (
'Gold' gold, 'Silver' silver, 'Bronze' bronze
))
order by noc
fetch first 6 rows only;
-- To overcome the problem in the previous statement,
-- this selects just the columns you need in the subquery.
-- But some events have multiple people who win the
-- same medal - e.g. doubles tennis. This pivot
-- counts rows in the table, not individual events.
select * from (
select noc, medal from olympic_medal_winners
)
pivot ( count(*) for medal in (
'Gold' gold, 'Silver' silver, 'Bronze' bronze
))
order by 2 desc, 3 desc, 4 desc
fetch first 5 rows only;
-- This solves the over counting problem in the
-- previous statement. It does this by finding the
-- distinct values for sport, event and gender
-- then counting the results.
select * from (
select noc, medal, sport, event, gender
from olympic_medal_winners
)
pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) for medal in (
'Gold' gold, 'Silver' silver, 'Bronze' bronze
))
order by 2 desc, 3 desc, 4 desc
fetch first 5 rows only;
-- You can have many functions in the pivot.
-- Oracle generates a column for each function
-- per value in the in clause. This finds the
-- gold medal winning countries. For each it shows:
-- - The number of different events these were won in
-- - The number of different sports they were won in
-- - The names of the athlete or team who won each medal
-- Finally it filters to only show those countries
-- that won at least two gold medals.
select * from (
select noc, medal, sport, event, gender, athlete
from olympic_medal_winners
)
pivot (
count( distinct sport ||'#'|| event ||'#'|| gender ) medals,
count( distinct sport ) sports,
listagg( athlete, ',') within group (order by athlete) athletes
for medal in ( 'Gold' gold )
)
where gold_medals > 1
order by gold_medals, gold_sports, noc
fetch first 5 rows only;
-- This is similar to the previous query.
-- But it finds those countries whose IOC code starts with D.
select * from (
select noc, medal, sport, event, gender, athlete
from olympic_medal_winners
)
pivot (
count( distinct sport ||'#'|| event ||'#'|| gender ) medals,
count( distinct sport ) sports,
listagg( athlete, ',') within group (order by athlete) athletes
for medal in ( 'Gold' gold )
)
where noc like 'D%'
order by gold_medals;
-- This produces a matrix, sports across the
-- top countries down the side. There's an X
-- for each sport that country has a row in the table for.
select * from (
select noc, sport
from olympic_medal_winners
)
pivot (min('X') for sport in (
'Archery' as arc, 'Athletics' as ath, 'Hockey' as hoc,
'Judo' as jud, 'Sailing' as sai, 'Wrestling' as wre
)
)
order by noc
fetch first 7 rows only;
-- This is the old school, pre Oracle Database
-- 11g method for pivoting data.
select noc,
count ( case when medal = 'Gold' then 1 end ) gold_medals,
count ( case when medal = 'Silver' then 1 end ) silver_medals,
count ( case when medal = 'Bronze' then 1 end ) bronze_medals
from olympic_medal_winners
group by noc
order by 2 desc, 3 desc, 4 desc
fetch first 5 rows only;
-- An example of how to build the pivot clause
-- values dynamically.
-- Note that when you do this the number of
-- columns can change between runs. So the
-- execute and fetch routine will be far
-- more complex in a real world scenario!
declare
sql_stmt clob;
pivot_clause clob;
begin
select listagg('''' || sport || ''' as "' || sport || '"', ',') within group (order by sport)
into pivot_clause
from (select distinct sport from olympic_medal_winners);
sql_stmt := 'select * from (select noc, sport from olympic_medal_winners)
pivot (count(*) for sport in (' || pivot_clause || '))';
dbms_output.put_line( sql_stmt );
execute immediate sql_stmt;
end;
/
-- The XML keyword dynamically builds the
-- list of values to pivot. But you get the
-- results in XML! Each "column" is an
-- element in this document.
select * from (
select noc, sport
from olympic_medal_winners
)
pivot xml (count(*) medal_winners for sport in (
select sport
from olympic_medal_winners
where sport like 'A%')
)
where rownum = 1;
-- This previous example gave every country
-- at least one medal in every sport! To
-- avoid this, you need to count a column
-- which will be null if the country
-- didn't win in a particular event.
select * from (
select noc, sport, athlete
from olympic_medal_winners
)
pivot xml (count(athlete) medal_winners for sport in (
select sport
from olympic_medal_winners
where sport like 'A%')
)
where rownum = 1;
-- This creates the final medal table
-- for the unpivot example below.
create table olympic_medal_tables as
select * from (
select noc, medal, sport, event, gender
from olympic_medal_winners
)
pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) for medal in (
'Gold' gold_medals, 'Silver' silver_medals, 'Bronze' bronze_medals
))
order by 2 desc, 3 desc, 4 desc;
-- Unpivot takes the columns and
-- converts them back to rows.
select * from olympic_medal_tables
unpivot (medal_count for medal_colour in (
gold_medals as 'GOLD',
silver_medals as 'SILVER',
bronze_medals as 'BRONZE'
))
order by noc
fetch first 6 rows only;
drop table olympic_medal_tables purge;
create table olympic_medal_tables as
select * from (
select noc, medal, sport, event, gender
from olympic_medal_winners
)
pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) medals,
count(distinct sport) sports
for medal in (
'Gold' gold, 'Silver' silver, 'Bronze' bronze
))
order by 2 desc, 4 desc, 6 desc;
-- You can unpivot two or more columns
-- to a single row. To do this, provide a
-- list of the columns you want to combine.
-- You then get a column for each in the results.
select * from olympic_medal_tables
unpivot ((medal_count, sport_count) for medal_colour in (
(gold_medals, gold_sports) as 'GOLD',
(silver_medals, silver_sports) as 'SILVER',
(bronze_medals, bronze_sports) as 'BRONZE'
))
fetch first 9 rows only;
drop table olympic_medal_tables purge;
create table olympic_medal_tables as
select * from (
select noc, medal, sport, event, gender, athlete
from olympic_medal_winners
)
pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) medals,
listagg(athlete, ',') within group (order by athlete) athletes
for medal in (
'Gold' gold, 'Silver' silver, 'Bronze' bronze
))
order by 2 desc, 4 desc, 6 desc;
-- Another example of unpivoting multiple columns.
-- This time with a list of athletes.
select * from olympic_medal_tables
unpivot ((medal_count, athletes) for medal_colour in (
(gold_medals, gold_athletes) as 'GOLD',
(silver_medals, silver_athletes) as 'SILVER',
(bronze_medals, bronze_athletes) as 'BRONZE'
))
where medal_colour = 'GOLD'
and medal_count = 2
order by noc
fetch first 3 rows only;
-- This first unpivots the results to get the
-- list of athletes who won two gold medals. It
-- then uses XML tokenization to split
-- the list into a row per person.
with rws as (
select * from olympic_medal_tables
unpivot ((medal_count, athletes) for medal_colour in (
(gold_medals, gold_athletes) as 'GOLD',
(silver_medals, silver_athletes) as 'SILVER',
(bronze_medals, bronze_athletes) as 'BRONZE'
))
where medal_colour = 'GOLD'
and medal_count = 2
)
select noc, athlete
from rws, xmltable (
'if (contains($X,",")) then ora:tokenize($X,"\,") else $X'
passing athletes as X
columns athlete varchar2(4000) path '.'
)
order by 1, 2
fetch first 6 rows only;
-- This creates the table of medals won by
-- each country per sport for use in the
-- examples below.
create table olympic_country_sport_medals as
select * from (
select noc, sport
from olympic_medal_winners
)
pivot (count(sport) for sport in (
'Athletics' as ath, 'Artistic Gymnastics' as gym, 'Cycling Track' as cyc,
'Boxing' as box, 'Sailing' as sai
)
)
order by 1;
-- This switches the rows and columns over
-- aka a transpose. It does so by chaining
-- a pivot followed by an unpivot.
select * from olympic_country_sport_medals
pivot (
sum(ath) ath, sum(box) box, sum(gym) gym, sum(sai) sai, sum(cyc) cyc
for noc in ('BRA' BRA, 'CHN' CHN, 'DEN' DEN, 'ESP' ESP, 'ETH' ETH, 'GRE' GRE )
)
unpivot (
(BRA, CHN, DEN, ESP, ETH, GRE ) for sport in (
(BRA_ATH, CHN_ATH, DEN_ATH, ESP_ATH, ETH_ATH, GRE_ATH) as 'Athletics',
(BRA_GYM, CHN_GYM, DEN_GYM, ESP_GYM, ETH_GYM, GRE_GYM) as 'Artistic Gym',
(BRA_BOX, CHN_BOX, DEN_BOX, ESP_BOX, ETH_BOX, GRE_BOX) as 'Boxing',
(BRA_SAI, CHN_SAI, DEN_SAI, ESP_SAI, ETH_SAI, GRE_SAI) as 'Sailing',
(BRA_CYC, CHN_CYC, DEN_CYC, ESP_CYC, ETH_CYC, GRE_CYC) as 'Track Cycling'
)
);
-- Transposing data using unpivot and pivot.
-- Much easier to write than the other way around!
select * from olympic_country_sport_medals
unpivot (
(medals) for sport in ( ath, box, gym, sai, cyc )
)
pivot (
sum(medals) for noc in (
'BRA' BRA, 'CHN' CHN, 'DEN' DEN, 'ESP' ESP, 'ETH' ETH, 'GRE' GRE
)
);
I explore ETags and how they can be used in cURL commands when interacting with Oracle REST APIs. I also discuss some of the performance benefits of using ETags. This is not exhaustive, but I hope it introduces you to ETags or reminds you of their existence! But first…
LATE-BREAKING NEWS!!
A related video
FYI: I reference a CSV_DATA table throughout this post. We use it pretty extensively in this LiveLab. And we just recently presented a webinar based on that same LiveLab. You can check that out below!
Don’t know what ETags are? No worries, here is a definition:
The ETag (or entity tag) HTTP response header is an identifier for a specific version of a resource. It lets caches be more efficient and save bandwidth, as a web server does not need to resend a full response if the content was not changed. Additionally, etags help to prevent simultaneous updates of a resource from overwriting each other (“mid-air collisions”).
If the resource at a given URL changes, a new Etag value must be generated. A comparison of them can determine whether two representations of a resource are the same.
ETags can help to guarantee the provenance of your resources (like the auto-REST enabled table you'll see shortly), but they can also ensure your applications consume fewer server/database resources and load comparatively faster.
To illustrate how ETags work, I did some tinkering with cURL commands, ORDS, and a Podman container. Read on if ye dare…to see what I discovered!
Oracle REST APIs and ETags
A couple of weeks ago, I noticed in the cURL documentation there was support for ETags. And the cURL docs have options for both --etag-save and --etag-compare (practical examples to follow). When you use these options in your cURL commands, you’ll either:
save an ETag to a separate text file (locally, like on your desktop in my example below), or
compare the ETag (in that existing file) to an ETag that belongs to your REST-enabled resource (the CSV_DATA table, which you’ll see in a second)
Oh, that’s a lot of words! So read it again and then continue with my walkthrough. Meanwhile, I’ll spin up this Podman container.
We are back in Podman.
INFO: Want to learn more about using Podman and Oracle database tools? Check out my other two Podman-related posts here and here!
ORDS in Standalone mode
I need ORDS up and running for this demonstration, so I issued the ords serve command in my Terminal. This will launch ORDS in standalone mode (using a Jetty server, as seen in the image). Once it’s initialized, I can log into SQL Developer Web to interact with my database (remember, in this example, it lives in a Podman container).
Here, I’ve logged into SQL Developer Web as a non-ADMIN user (ORDSTEST in this case).
From the Database Actions Launchpad, I navigated to the SQL Worksheet.
And to keep this quick, I reused a table I created for that webinar we just did. I also auto-REST enabled it (so I could play with the cURL commands). Below, you’ll see it’s just a quick right-click with the mouse.
FYI: As a reminder, if you want to learn more about cURL commands, check out the LiveLabs workshop that this is based on. You can find that here.
Getting the cURL Command
Once I auto-REST enabled the CSV_DATA table, I selected the GET ALL REST cURL command.
This is the cURL command I’ll use for this experiment.
At this point, I still wasn’t sure that an ETag was sent from the server for those auto-REST-enabled resources (in this case, the CSV_DATA table). I know they are present when you build your own REST Modules with ORDS; (at the time) I was just less confident about the auto-REST resources.
SPOILER ALERT: ETags are present for auto-REST-enabled resources too (I'm dumb, and this is pretty widely known)!
–etag cURL options
Once I knew ETags were accessible for auto-REST-enabled resources, I experimented with cURL‘s --etag options (you’ll see how I implemented these in the upcoming cURL command).
The --etag-save [filename] and --etag-compare [filename] options work such that when you issue the --etag-save in addition to that initial cURL command, a single-line file will be saved to the directory you are currently in (you'll see that file shortly).
This differs from how an application might work, but the concept is the same. You’re storing the ETag’s value somewhere accessible to the application. For my purposes, I need to keep this ETag somewhere the cURL command line utility can find it.
The initial cURL command
I hopped over to my Terminal and used that [slightly modified] cURL command (the one I previously retrieved from the SQL Worksheet). You’ll see that I included additional options/arguments:
--verbose
--etag-save
| json_pp
This is the first cURL command I issued.
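Since the command itself lives in a screenshot, here is a hedged reconstruction of what I ran (the endpoint URL is an assumption based on my local setup; yours will differ):

# Save the resource's ETag to a local, single-line text file
curl --verbose \
  --etag-save myobjectetag.txt \
  'http://localhost:8080/ords/ordstest/csv_data/' | json_pp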
FYI: Apparently, the json_pp command utility is a part of Perl. I think it ships with macOS, but I'm not 100% sure. Do you know? It worked for me and pretty-printed my JSON response (notice how I used the pipe "|" in addition to the actual command).
When you use that --etag-save option, a file with the value of the ETag will be saved locally. You can see me retrieving that file and reviewing the ETag file (note in the above cURL command, I named the file “myobjectetag.txt“).
Listing the files in the current directory. Locating the myobjectetag.txt file. Opening the file and inspecting the ETag value.
I can now use this ETag in subsequent GET requests to determine if the resource (the CSV_DATA table) I’m requesting has changed since I last interacted with it. What would constitute a change? Maybe rows have been updated or removed; perhaps an additional column was added. Or maybe the table was restructured somehow; it could be any change.
But, let me pause briefly and explain the --verbose option.
About the verbose option
The printout from the --verbose option. The remaining JSON object is nicely printed out.
I used the --verbose option to inspect the information available when interacting with this Oracle REST endpoint. I don’t need to include it now since I know the ETag is coming through, but I left it in this cURL command example so that you could have a look yourself. You’ll see loads of information, including (but not limited to):
Connection information
The cURL version used
The Status Code returned (200 or OK in this case)
ETag info
In this example, all I care about is the presence of an ETag. I can now use that ETag in a subsequent GET request to determine if the resource on the server side has changed. Here is what the cURL command looks like with the --etag-compare option:
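(The command was in a screenshot; this is a hedged sketch of it, with the same assumed endpoint as before.)

# Only download the payload if the resource's ETag differs from the saved one
curl --verbose \
  --etag-compare myobjectetag.txt \
  'http://localhost:8080/ords/ordstest/csv_data/' | json_pp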
That cURL command looks very similar, except for that --etag-compare option. In this situation, cURL first checks to see if your ETag and the resource’s (the API endpoint on your server) ETag match. If they do, the request stops. And if you use the --verbose option, you can see what comes back from the server:
A whole bunch of nothing. Not really, though: that “If-None-Match” header is the secret sauce. It is a conditional header that is passed over to the server. Essentially it says, “If this header value doesn’t match yours, then send over the requested data; otherwise, end the request here, because we already have the information we need stored locally.”
INFO: Read up on caches, because that's essentially what your application is going to use instead of having to go through the entire GET request/response cycle.
The request is terminated, but what does this mean from a performance perspective? Well, say you have a webpage that loads and later reloads in response to a user’s interaction (I simulated this with the two cURL commands). That page will probably need some information from the server to populate that page. In a situation like this, you could first ask your application to share your copy of the ETag with the server in a subsequent GET request header (“If-None-Match“). And if nothing has changed, you could speed up page load times by just refreshing with what you have stored in a cache while freeing up resources on your server for other processes. But this is just one example.
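And if you want to see those mechanics without cURL's ETag options, you can send the conditional header by hand. A minimal sketch, assuming you paste in the value saved in myobjectetag.txt; a 304 Not Modified response means your cached copy is still current:

# --include prints the response headers so you can see the status code
curl --include \
  --header 'If-None-Match: "paste-the-saved-etag-value-here"' \
  'http://localhost:8080/ords/ordstest/csv_data/'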
Possibilities with ETag
I’ve given this some thought, and I bet there are quite a few use cases where referring to an ETag before executing an HTTP method (like a GET or GET ALL) might be helpful.
You may want to periodically check to see if a resource has changed since you last interacted with it. Could you incorporate ETags into your build processes or longer-running jobs (maybe something around data analysis)?
Actually, ETags play a massive role in JSON-Relational Duality Views. We have an entire section in the ORDS docs on how to use them! And suppose you want to download a containerized version of Oracle Database 23c (the one that supports JSON-Relational Duality Views). You can do that via this link (I think I should do this too and highlight some of the cool ORDS + JSON Duality View features)!
Well, this brings me to the end of this post. I’m hoping you learned something and came away with some good resources. And if you found this post helpful, please pass it along! And don’t be afraid to comment too! I’d love to hear your thoughts. Maybe I’ll even include your idea in a follow-up post!
The title says it all. I’ve run through this about ten times now. But I’ll show you how to start a Podman container (with a volume attached) and install ORDS on your local machine. And then, once installed, we’ll create and REST-enable a user so that the user can take full advantage of Oracle REST APIs. (aka ORDS). I’ll finally show you how to log into a SQL Worksheet as that new user. Ready? Let’s go!
Oracle Container Registry
First, visit the registry. Navigate to the Database product category, then select Enterprise. Even better, just navigate to the 21c or 23ai images directly (thanks, Killian!). Ensure you’ve also signed into the site (otherwise, you won’t be able to pull this image).
For Podman, I’ll review the preferred way to start this container (with a volume; for persisting your data across sessions).
Volumes
Start your Podman machine with the podman machine start command. Then create a volume with the podman volume create command (that way, you can save data locally and use that volume each time you start your container). Now that we have the volume, we can create a new container and attach that volume simultaneously (more Podman volume info here).
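Assuming the volume name I reuse below (entdb213vol), those first two commands are simply:

podman machine start
podman volume create entdb213vol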
There are a few ways you can attach volumes when starting a container, but I like the one I found in this video:
Ironically, this is an Oracle video. But it is one of the most straightforward ones I found. To start the container, here is the command I used:
podman run -d --name entdb213 -p 1521:1521 --mount=type=volume,source=entdb213vol,destination=/opt/oracle/oradata container-registry.oracle.com/database/enterprise:21.3.0.0
About port mapping
You’ll notice that I used the following port mapping: -p 1521:1521. You can remove that leading 1521. If you do, Podman will bind the exposed container port to a random port on your host (a MacBook, in my case) within an ephemeral port range. Ephemeral?? (I take that to mean any available port in a temporary, high-numbered range.)
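And if you do let Podman pick a random host port, you can look up the mapping afterwards (using the container name from the command above):

# Shows the container's host-to-container port mappings
podman port entdb213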
At this point, I have created a volume and started a container (with an Oracle Enterprise database inside).
PRO TIP: If this is your first time starting a container with a database of this size, it will take several minutes. So go do some chores while you're waiting.
Altering the db password
I’ll change the database Administrator password to something I can easily remember (“oracle“) using this command:
# Original command
# docker exec <oracle-db> ./setPassword.sh <your_password>
# My modified command
podman exec entdb213 ./setPassword.sh oracle
Changing the password to something I can easily remember.
Note: There are several shell scripts included in this container; one of which is the change password script. There are more details on the Oracle Container Registry > Oracle Database Enterprise Edition page (redirects prevent me from linking directly to that page).
Downloading ORDS
Next, I’ll head to the Oracle REST Data Services download page. And download the latest ORDS build (I’ll be setting this up shortly, I’m just gathering and configuring everything now).
Once that ZIP file is in my downloads folder, I’ll unzip it. At this point, this folder will still be named ords latest. You can certainly keep it like that, but I’ve renamed it to ords_product_folder. This is similar to how we refer to it in our installation/configuration documentation (changing it might make it easier to follow along).
ORDS Configuration
There are two configuration steps I need to perform before I can begin the ORDS installation. First, you’ll need to set an environment variable pointing to the binaries (these are in the bin folder; you should see that in the above image) found in the ords_product_folder. Secondly, you’ll need to create an ORDS configuration folder.
WAIT: If you're still reading this, how is my approach? After some research, placing these two folders in the "Library" seemed to make the most sense. I'm not sure what the analog on a Windows machine would be though. Drop a comment below if you know!
At this point, I’m nearly finished with this initial configuration. I next opened my .zprofile file (this is the file where I’m keeping most of my environment variables) and added the following paths:
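The exact lines live in a screenshot, but they amount to something like this (a sketch; the folder names are the ones I chose above, and ORDS also honors the ORDS_CONFIG environment variable as an alternative to passing --config on every command):

# Assumed locations; adjust to wherever you placed the folders
export PATH="$HOME/Library/ords_product_folder/bin:$PATH"
export ORDS_CONFIG="$HOME/Library/ords_config"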
Why there? I can't seem to find a definitive answer as to where these paths should be saved, but this thread on Stack Exchange does a great job explaining all these files (and when/where they are used).
ORDS Installation
You’ll want to exit out of all your Terminal sessions so that the next session can pick up those changes to the .zprofile file. Podman will still keep doing its thing in the background, and hopefully, by this time, the database container will display as “healthy.”
STOP: This may be obvious to you, but it wasn't to me: the database needs to be ready (healthy), online, and active (whatever you want to call it) for ORDS to install. You can always issue the podman ps command to check the status of the container.
Remember this; you’ll need it shortly.
ORDS install, the first attempt
In a new Terminal, I’ll issue the ords install command. If you’ve set up your bin and config environment variables like me, then you shouldn’t have any issues. Moving through the first few steps is easy.
The ORDS interactive installer will default to recommended settings. Most of the time, these will be correct. Since this is my first time installing ORDS, I’ll choose Option 2 in that first step. I can use “localhost” as the database hostname and 1521 as the port.
When you get to the database service name, that’s where you might get hung up. The ORDS installer assumes default configuration settings. But here, if you select “orcl” as the database service name, it will install ORDS in the entire database. This is not technically incorrect, but our ORDS Best Practices recommend you install ORDS in a Pluggable Database (PDB). So I’ll issue the podman logs command (in my case: podman logs entdb213) to find the name of the PDB: ORCLPDB1 (that’s the default for this container; it’s well-documented in the container registry docs, I’m just an idiot).
This is a neat trick, but it’s also documented in the Container Registry docs.
The ORDS interactive installer is very forgiving. I’ve noticed I can exit out of the installation process pretty much anywhere. Given that fact, I’ll restart my Terminal and start the ORDS install over (now that I have the correct database service name).
ORDS install, for real this time
In these images, you can better see the ords install command (it was slightly grayed out in the previous image). And you’ll also see all the steps completed. You can probably keep everything default as I did. You might want to, since our documentation refers to that 8080 port (as seen in the images) in most of our tutorials and examples. I find it easier to follow along with the docs when everything matches.
NOTE: Make sure you choose to run ORDS in standalone mode. That way you can continue to follow along in later steps.
Here you can see the paths to the bin and config folders. Conditions are perfect.
The rest of the installation is largely unremarkable. You will notice a few things, though:
The paths we’ve pointed to for the config and bin folders
The “settings” names (no action required by you, it’s just cool to see)
Once the installation finishes, the text “Oracle REST Data Services initialized” will appear
That final image confirms that ORDS is now running in standalone mode. You can visit the following:
localhost:8080/ords/sql-developer
Logging into the SQL Worksheet, the first attempt
And try to log in with the SYS credentials.
One does not simply log into Database Actions…you must REST-enable a user first.
SPOILER ALERT: You can't! Muahahahahaha!
That was a dirty trick. While ORDS is installed in ORCLPDB1, we must first create and REST-enable a user. I will make up for this dirty trick by sharing one of my favorite cheat codes for learning SQL and PL/SQL.
NOTE: You cannot REST-enable the SYSTEM or SYS user.
If you're lucky, you have an Oracle Cloud Free Tier account with at least one Autonomous database provisioned.
CORPORATE SHILL ALERT: You can sign up for one here.
Code cheating with the Autonomous database
Login to Database Actions as the Administrator. Navigate to User Management. In that dashboard, select the + Create User button.
In the Administration section. Click the Create User button.
When the slider appears, enter the information for the ORDSTEST user (like you see in the image here).
Once you’ve entered everything in, hit that “Show code” switch.
You’ll need to enable “Web Access” for this user. When you do this (click the switch), two things will happen:
The CONNECT and RESOURCE roles will be automatically selected for you
The Authorization required toggle will be enabled – shut this off for now
Once you’ve done that, you can select the “Show code” switch at the bottom of the slider. This will reveal the actual code that is being executed should you click the “Create User” button (which you will most certainly NOT!).
Copy and paste this into a text editor.
I copied this code and placed it into a text editor. I made one small change to the QUOTA line (at the bottom of the script).
Stole this from the ORDS Quick Start Guide. I am making a slight change to that last line.
Then I headed back to my Terminal and opened a new tab. I’ll execute this code in the database (remember, it’s in that Podman container running idle in the background this entire time) using SQLcl.
Forgot your connection string?
IF YOU FORGET the connection string format for logging in, have no fear! That Jeff Smith recently showed me the history command. I also have a couple of shorts on how I used the command:
Using the history command + the number of your choice. Here is a connection string that is close enough! Entering the slightly modified string into SQLcl.
Using SQLcl to REST-enable a user
Now that I have the proper format for the connection string, I’ll adjust it so the password is correct. Then I’ll execute the code in SQLcl to create a new database user and REST-enable that user’s schema.
I changed that final line; this is what it looks like in SQLcl.
Cheat code with PL/SQL in SQLcl
I’ve just learned you can REST-enable yourself by logging into SQLcl (i.e., connecting to the database in the Podman container) and issuing the following command:
EXECUTE ORDS.ENABLE_SCHEMA;
This command assumes that you have already been granted the CONNECT and RESOURCE roles but have yet to REST-enable your schema (REST-enabling is what allows ORDS to act as the intermediary between the database and the rest of the web).
The command will employ the procedure’s default parameters, which are:
ORDS.ENABLE_SCHEMA(
    p_enabled             IN boolean DEFAULT TRUE,
    p_schema              IN ords_schemas.parsing_schema%type DEFAULT NULL,
    p_url_mapping_type    IN ords_url_mappings.type%type DEFAULT 'BASE_PATH',
    p_url_mapping_pattern IN ords_url_mappings.pattern%type DEFAULT NULL,
    p_auto_rest_auth      IN boolean DEFAULT NULL);
Here is what a sample output would look like, if I were signed in as the HR user:
An example output, as if I were signed in as the HR user.
FYI: The above image is just a sample and is not related to the rest of the images in this article. Be sure to pay attention to the connection string (sql hr/oracle@localhost:1521/freepdb1): this is the HR user logging into SQLcl and REST-enabling their own schema. That's why you see references to HR throughout. I don't want anybody to get confused!
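And if you would rather stay in the shell, you could even pipe the procedure call through SQLcl in one go. A sketch, reusing that same HR example connection string:

# REST-enable your own schema without an interactive SQLcl session
sql hr/oracle@localhost:1521/freepdb1 <<EOF
EXECUTE ORDS.ENABLE_SCHEMA;
EXIT
EOF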
Logging into the SQL Worksheet, for real this time
With all this code executed, I can NOW go back to the SQL Worksheet (remember, we’re on localhost:8080/ords/sql-developer) and log in as the newly created ORDSTEST user.
I am logging in as the ORDSTEST user. A true “will-they, won’t they” moment. Congrats, you have arrived!
And once you’re in, you’ll notice the SQL Worksheet is no different than what you might expect in the Oracle Autonomous database. So if you made it this far, go forth and CREATE, DELETE, INSERT, SELECT, and Query away, friends!
Shutting it all down
Once you are done playing and tinkering, you can log out of the SQL Worksheet, stop the ORDS process with CTRL + C (on Mac, at least), stop the Podman container, and shut down the Podman virtual machine.
Use Control + C to stop the ORDS process. I am stopping the entdb213 container. Exiting from the Podman virtual machine.
And since we set this all up with a volume (so….so long ago, I know; we called it entdb213vol), you can start the container later on, and all your work will still be there (i.e., It shall persist!).
The end
Congrats, we made it! What do you think? Did I miss anything? If so, comment, and I’ll respond and update this post as needed. And if you think this could be useful for others, do share!
The plan was to create an ORACLE REST endpoint and then POST a CSV file to that auto-REST enabled table (you can see how I did that here, in section two of my most recent article). But, instead of doing this manually, I wanted to automate this POST request using Apple’s Automator application…
Me…two paragraphs from now
Follow along with the video
The plan
I did it. I went so far down the rabbit hole, I almost didn’t make it back alive. I don’t know when this ridiculous idea popped into my head, but it’s been well over a year. Until now, I either hadn’t had the time or the confidence to really tackle it.
The plan was to create an ORACLE REST endpoint and then POST a CSV file to that auto-REST enabled table (you can see how I did that here, in section two of my most recent article). But, instead of doing this manually, I wanted to automate this POST request using Apple’s Automator application.
The use case I made up was one where a person would need to periodically feed data into a table. The data doesn’t change, nor does the target table. Here is an example of the table I’m using:
The basic structure of the Bank Transfers table
And the DDL, should there be any interest:
CREATE TABLE ADMIN.BANK_TRANSFERS
(TXN_ID NUMBER ,
SRC_ACCT_ID NUMBER ,
DST_ACCT_ID NUMBER ,
DESCRIPTION VARCHAR2 (4000) ,
AMOUNT NUMBER
)
TABLESPACE DATA
LOGGING
;
Once this table was created, I auto-REST enabled the table and retrieved the complete cURL command for performing a Batch Load request. Remember, we have three examples for cURL commands now; I chose Bash since I’m on a Mac:
Retrieving the Batch Load cURL Command
Once I grabbed the cURL Command, I would temporarily save it to a clipboard (e.g. VS Code, TextEdit, etc.). I’d then create a new folder on my desktop.
The newly created ords_curl_post folder
How I actually did it
I’d then search via Spotlight for the Automator application. Once there, I’d choose Folder Action.
Choosing Folder Action for this automation
HEY!! README: I'm going to breeze through this. And it may seem like I am well-acquainted with this application. I am not. I spent hours debugging, reading through old StackExchange forums, and Apple documentation so I could share this with you. There is a ton more work to do. But bottom line, this thing works, and it's something that is FREE and accessible for a lot of people. You could have a TON of fun with this stuff, so keep reading!
There’s no easy way to get around this, but to get really good at this, you’ll just need to tinker. Luckily, most of these automation modules are very intuitive. And there is a ton of information online on how to piece them all together.
Automator
All of these modules are drag-and-drop, so it makes it easy to create an execution path for your Folder Action application. Eventually, I ended up with this (don’t worry, I’ll break it down some, a video is in the works for a more detailed overview):
Complete Folder Action automation for the ORDS Batch Load request
The modules
The modules I’m using are:
Get Specified Finder Items
Get Folder Contents
Run Shell Script (for a zsh shell, the default for this MacBook)
Set Value of Variable
Get Value of Variable
Display Notification
You can see at the very top, that I have to choose a target folder since this is a folder action. I chose the folder I created; ords_curl_post.
Get Specified Finder Items and Get Folder Contents
The first two modules are pretty straightforward. You get the specified finder items (from that specific folder). And then get the contents from that folder (whatever CSV file I drop in there). That will act as a trigger for running the shell script (where the filename/s serve as the input for the cURL Command).
PAUSE: I must confess, I had essentially ZERO experience in shell scripting prior to this, and I got it to work. It's probably not the prettiest, but damn if I'm not stoked that this thing actually does what it is supposed to do.
The main considerations for this shell script are that you’ll want to stay with zsh, and you’ll want to choose “as arguments” in the “Pass input” dropdown menu. Choosing “as arguments” allows you to take that file name and apply it to the For Loop in the shell script (you’ll see a sketch of it shortly). I removed the echo "$f" because all it was doing was printing out the file name (which makes sense, since it was the variable in this script).
Choosing “as arguments“
The Shell Script
I started from the stock Batch Load cURL command I copied earlier.
I made some modifications though. I made sure Content-Type was text/csv. And then I added some fancy options for additional information (more details on this here, go nuts) when I get a response from the database.
REMINDER: I didn't know how to do this until about 30 mins before I got it to work. I'm emphasizing this because I want to drive home the point that with time and some trial-and-error, you too can get something like this to work!
With my changes, the new cURL command turned into quite a mess of options (you’ll see a sketch of it in the shell script below). That -w option stands for write-out. When I receive the response from the Batch Load request, I’ll want the following information:
Response Code (e.g. like a 200 or 400)
Total Upload Time
Upload Speed
Upload Size
All of that is completely optional. I just thought it would be neat to show it. Although, as you’ll see in a little bit, Apple notifications have some weird behavior at times, so you don’t really get to see all of the output.
I then applied the cURL command to the shell script (with some slight modifications to the For Loop), and it ended up looking like this:
New shell script with updated cURL command
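The screenshot doesn’t reproduce well here, so this is roughly what my Run Shell Script module contained (a sketch: the target URL is a placeholder for your own Batch Load endpoint, and the dropped files arrive as arguments):

# zsh; with "Pass input" set to "as arguments", "$@" holds the dropped files
for f in "$@"
do
  curl -X POST \
    --header 'Content-Type: text/csv' \
    --data-binary "@$f" \
    --write-out 'Response: %{response_code}\nTotal time: %{time_total}\nUpload speed: %{speed_upload}\nUpload size: %{size_upload}\n' \
    'https://<your-adb-host>/ords/admin/bank_transfers/batchload'
done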
Here is what the output looked like when I did a test run (with a sample CSV):
Success on the cURL command
Set Value of Variable
All of that output, referred to as “Results,” will then be set as a variable. That variable will be henceforth known as responseOutput (fun fact: that naming style is called camel casing…I learned that like 3-4 months ago). You’ll first need to create the variable, and once you run the folder action, it’ll apply the results to that variable. Like this:
Creating a new variable. Results from the cURL command applied to the variable.
Get Value of Variable and Display Notification
Those next two modules simply “GET” the value of the variable/results and then send that value to the Display Notification module. This section is unremarkable; moving on.
And at this point, I was done. All I needed to do was save the script and then move on to the next step.
Folder Actions Setup
None of this will really work as intended until you perform one final step. I’ll right-click the target folder and select “Folder Actions Setup.” From there a dialog will appear; you’ll want to make sure both the folder and the script are checked.
Selecting Folder Actions Setup. Double-checking that everything is enabled.
Trying it out
Next, I emptied the folder. Then I dropped in a 5000-row CSV file and let Folder Actions do its thing. This entire process is quick! I’m loving the notification, but the “Show” button simply does not work (I think that is a macOS quirk though). However, when I go back to my Autonomous Database, I can 100% confirm that this ORDS Batch Load worked.
Successful Batch Load. Double-checking the Autonomous Database.
Final thoughts
This was relatively easy to do. In total, it took me about 3-4 days of research and trial and error to get this working. There is a lot I do not know about shell scripting. But even with a rudimentary understanding, you too can get this to work.
Next, I’d like to create a dialog window for the notification (the output from the cURL Command). I believe you can do that in AppleScript; I just don’t know how yet.
If you are reading this and can think of anything, please leave a message! If you want to try it out for yourself, I’ve shared the entire workbook in my GitHub repo, which can be found here.
I’ll also be doing an extended video review of this, where I’ll recreate the entire automation from start to finish. Be on the lookout for that too!
Overview and connecting with the python-oracledb library
Part II
Connecting with Oracle REST APIs unauthenticated
Part III
Custom Oracle REST APIs with OAuth2.0 Authorization
Welcome back
I finally had a break in my PM duties to share a small afternoon project [I started a few weeks ago]. I challenged myself to a brief Python coding exercise. I wanted to develop some code that allowed me to connect to my Autonomous Database using either our python-oracledb driver (library) or with Oracle REST Data Services (ORDS).
I undertook this effort as I also wanted to make some comparisons and maybe draw some conclusions from these different approaches.
NOTE: If you don't feel like reading this drivel, you can jump straight to the repository where this code lives. It's all nicely commented and has everything you need to get it to work. You can check that out here.
The test files
Reviewing the code, I’ve created three Python test files. test1.py relies on the python-oracledb library to connect to an Oracle Autonomous database while test2.py and test3.py rely on ORDS (test3.py uses OAuth2.0, but more on that later).
test1.py uses the python-oracledb library. test2.py relies on an unsecured ORDS endpoint. test3.py uses ORDS, secured with OAuth2.
Configuration
Configuration directory
I set up this configuration directory (config_dir) to abstract sensitive information from the test files. My ewallet.pem and tnsnames.ora files live in this config_dir. These are both required for Mutual TLS (mTLS) connection to an Oracle Autonomous database (you can find additional details on mTLS in the docs here).
ewallet.pem and tnsnames.ora files
Other files
OAuth2.0, Test URLs, and Wallet Credential files
Other files include oauth2creds.py, testurls.py, and walletcredentials.py. Depending on the test case, I’ll use some or all of these files (you’ll see that shortly).
NOTE: If not obvious to you, I wouldn't put any sensitive information into a public git repository.
Connecting with python-oracledb
One approach to connecting via your Oracle database is with the python-oracledb driver (library). An Oracle team created this library (people much more experienced and wiser than me), and it makes connecting with Python possible.
FYI: I’m connecting to my Autonomous Database. If you want to try this, refer to the documentation for using this library and the Autonomous database. You can find that here.
The Python code that I came up with to make this work:
# Connecting to an Oracle Autonomous Database using the python-oracledb driver.
import oracledb

# A separate Python file I created and import here. It contains my credentials,
# so as not to show them in this script.
from walletcredentials import uname, pwd, cdir, wltloc, wltpwd, dsn

# Requires a config directory with the ewallet.pem and tnsnames.ora files.
with oracledb.connect(user=uname, password=pwd, dsn=dsn, config_dir=cdir,
                      wallet_location=wltloc, wallet_password=wltpwd) as connection:
    with connection.cursor() as cursor:
        # SQL statements should not contain a trailing semicolon (";") or forward slash ("/").
        sql = """select * from BUSCONFIND where location='ZAF'
                 order by value ASC"""
        for r in cursor.execute(sql):
            print(r)
Near the top, you can see how I import the wallet credentials from the walletcredentials.py file. Without that information, this code wouldn’t work. I also import the database username, password, and configuration directory (which includes the ewallet.pem and tnsnames.ora files).
From there, the code is pretty straightforward. However, some library-specific syntax is required (the complete details are in the docs, found here), but aside from that, nothing is too complicated. You’ll see the SQL statement assigned to the sql variable; the proper SQL format looks like this:
SELECT * FROM busconfind WHERE location='zaf'
ORDER BY value ASC;
And here is an example of this SQL output in a SQL Worksheet (in Database Actions):
Reviewing the SQL in Database Actions
FYI: This is a Business Confidence Index dataset, in case you were curious (retrieved here).
That SQL allows me to filter on a Location and then return those results in ascending order according to the Value column. When I do this using the python-oracledb driver, I should expect to see the same results.
NOTE: You've probably noticed that the SQL in the Python file differs from that seen in the SQL Worksheet. That is because you need to escape the single quotes surrounding ZAF, as well as remove the trailing semicolon from the SQL statement. It's all in the python-oracledb documentation; you just have to be aware of this.
Once I have all the necessary information in my walletcredentials.py file, I can import that into the test1.py file and execute the code. I chose to run this in an Interactive Window (I’m using VS Code), but you can also do this in your Terminal. In the images (from left to right), you’ll see the test1.py file, then a summary of the output from that SQL query (contained in the test1.py code), and finally, the detailed output (in a text editor).
Executing the Python code in an Interactive Window. Summary output from test1.py. Detailed output from test1.py.
Wrap-up
For those that have an existing Free Tier tenancy, this could be a good option for you. Of course, you have to do some light administration. But if you have gone through the steps to create an Autonomous database in your cloud tenancy, you probably know where to look for the tnsnames.ora and other database wallet files.
I’m not a developer, but I think it would be nice to simplify the business logic found in this Python code, or maybe abstract it completely. For prototyping an application (perhaps one that isn’t microservices-oriented), or for data and business analysts, this could do the trick for you. In fact, the data is returned to you as rows of tuples, so turning this into a CSV or reading it into a data analysis library (such as pandas) should be fairly easy!
Connecting via ORDS: sans OAuth2.0
Auto-REST and cURL
I’m still using the “devuser” (although this may be unnecessary, as any unsecured REST-enabled table would do). I’m using the same table as before; the only change I’ve made is to auto-REST enable the BUSCONFIND table for the test2.py code.
In the following images, I’m retrieving the cURL command for performing a GET request on this table.
NOTE: In a recent ORDS update, we made available different shell variations (this will depend on your OS); I've selected Bash.
From there, I take the URI (learn more on URIs) portion of the cURL command and place it into my browser. Since this table is auto-REST enabled, I’ll only receive 25 rows from this table.
NOTE: The ORDS default pagination is limit = 25.
Getting the cURL command from an already ORDS REST-enabled table. Selecting the GET request for Bash. GET response in JSON. The raw JSON, pretty printed.
The code
And the code for this test2.py looks like this:
# Auto-REST enabled with ORDS; in an Oracle Autonomous Database with query parameters.
import requests
import pprint
# Importing the base URI from this python file.
from testurls import test2_url
# An unprotected endpoint that has been "switched on" with the ORDS Auto-REST enable feature.
# Query parameters can be added/passed to the Base URI for GET-ing more discrete information.
url = (test2_url + '?q={"location":"ZAF","value":{"$gt":100},"$orderby":{"value":"asc"}}')
# For prototyping an application, in its earlier stages, this could really work. On your front end, you
# expect the user to make certain selections, and you'll still pass those as parameters.
# But here, you do this as a query string. In later stages, you may want to streamline your application
# code by placing all this into a PL/SQL or SQL statement. Thereby separating application
# logic and business logic. You'll see this approach in the test3.py file.
# This works, but you can see how it gets verbose, quick. It's a great jumping-off point.
responsefromadb = requests.get(url)
pprint.pprint(responsefromadb.json())
The import and the query string are the two areas to focus on in this example. First, I import my URL from the testurls.py file (again, abstracting it so it’s not in the main body of the code).
The test2.py and testurls.py files
And then I append a query string to the end of that URL. ORDS expects the query parameters to be a JSON object with the following syntax:
[ORDS Endpoint]/?q={"JSON Key": "JSON Value"}
The new, complete query string requests the same information as was requested in the test1.py example (a cURL version follows the list below):
This string begins with that same BASE URI for the ORDS endpoint (the auto-REST enabled BUSCONFIND table) and then applies the query string prefix “?q=” followed by the following parameters:
Filter by the location "ZAF"
Limit the search of these locations to values (in the Value column) greater than ($gt) 100
Return these results in ascending order (asc) of the Value column
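And if you’d rather test this filter outside of Python, cURL can URL-encode the JSON for you (the base URI here is a placeholder for your own auto-REST endpoint):

# -G turns --data-urlencode into a GET query string
curl -G 'https://<your-adb-host>/ords/devuser/busconfind/' \
  --data-urlencode 'q={"location":"ZAF","value":{"$gt":100},"$orderby":{"value":"asc"}}'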
NOTE: You can manipulate the offsets and limits in the python-oracledb driver too. More info found here. And filtering in queries with ORDS can be found here.
And if I run the test2.py code in the VS Code Interactive Window, I’ll see the following summary output.
Summary output from the response in test2.py
Here is a more detailed view in the VS Code text editor:
Detailed output with helpful links
Wrap-up
A slightly different approach, right? The data is all there, similar to what you saw in the test1.py example. There are a few things to note, though:
The consumer of this ORDS REST API doesn’t need access to the database (i.e. you don’t need to be an admin or have a schema); you can perform GET requests on this URI.
The response body is in JSON (ubiquitous across the web and web applications)
Also, language and framework agnostic (the JSON can be consumed/used widely, and not just with Python)
You are provided a URI for each item (i.e. entry, row, etc.)
No need for SQL; just filter with the JSON query parameters
No business logic in the application code
Needless to say, no ORMs or database modeling is required for this approach
However…security is, ahem…nonexistent. That is a problem, and it flies in the face of what we recommend in our ORDS Best Practices.
Connecting via ORDS: secured with OAuth2
Note: This is an abbreviated explanation, I'll be posting an expanded write-up on this example post haste!
Since this is what I’m considering “advanced” (it’s not difficult, there are just many pieces), I’m going to keep this section brief. Long story short, I’ll take those query parameters from above and place them into what is referred to as a Resource Handler.
TIME-OUT: Auto-REST enabling a database object (the BUSCONFIND table in this case) is simple in Database Actions. It's a simple left-click > REST-enable. You saw that in the previous example. You are provided an endpoint, and you can use the query parameters (i.e., the JSON {key: value} pairs) to access whatever you need from that object.
However, creating a custom ORDS REST endpoint is a little different. First you create a Resource Module, next a Resource Template (or several), and then a Resource Handler (or several). In that Resource Handler, you'll find the related business logic code for that particular HTTP operation (the menu includes: GET, POST, PUT, and DELETE).
The Resource Module
The process of creating a custom ORDS API might be difficult to visualize, so I’ll include the steps I took along with a sample query (in that Resource Handler) to help illustrate.
Creating the Resource Module in the ORDS REST Workshop. Creating the Resource Template. Reviewing the available operations for the Resource Template. The newly created Resource GET Handler. Placing the SQL directly into the Resource Handler. Testing out the code to simulate a GET request using “ZAF” as the location. Reviewing the output of that SQL query, in a table format.
Chances are you may be the administrator of your Always Free tenancy, so you have full control over this. Other times, you might be provided the REST endpoint; in that case, you may never have to worry about these steps. Either way, you can see how we’re simulating the query with this final example (test3.py), while both abstracting it away and keeping the business logic in the database.
Security
The OAuth 2.0 authorization framework enables a third-party application to obtain limited access to an HTTP service, either on behalf of a resource owner by orchestrating an approval interaction between the resource owner and the HTTP service, or by allowing the third-party application to obtain access on its own behalf.
RFC 6749: The OAuth 2.0 Authorization Framework
I’ll keep this section brief, but I’m protecting this resource through the aid of an ORDS OAuth2.0 client. I’ve created one here:
After creating a client you can use the provided URL for requesting a new Bearer Token
And, as you’ll see shortly, I’ll rely on some Python libraries for requesting an Authorization Token to use with the related Client ID and Client Secret. If you want to nerd out on the OAuth2.0 framework, I challenge you to read this.
test3.py example
NOTE: Remember, I'm keeping this section intentionally brief. It deserves a slightly deeper dive, and class is almost over (so I'm running out of time).
The code for this example:
# Custom ORDS Module in an Oracle Autonomous Database.
import requests
from requests_oauthlib import OAuth2Session
from oauthlib.oauth2 import BackendApplicationClient
import pprint
import json
# Importing the base URI from this python file.
from testurls import test3_url
# A separate python file I created and later import here. It contains my credentials,
# so as not to show them in this script here.
from oauth2creds import token_url, client_id, client_secret
client = BackendApplicationClient(client_id=client_id)
oauth = OAuth2Session(client=client)
token = oauth.fetch_token(token_url, client_id=client_id, client_secret=client_secret)
bearer_token = token['access_token']
# Location can be anything from the table. Now, only the single variable needs to be passed; the business logic has been
# abstracted somewhat, as it now resides within ORDS. This could make your application more portable
# (to other languages and frameworks, since there are fewer idiosyncrasies and dependencies):
location = "ZAF"
# print(location)
# ------------------------------------------------------------------------------ #
# In Database Actions, we:
# 1. Create an API Module
# 2. Then create a Resource Template
# 3. Finally, a GET Resource Handler that consists of the code from test1.py:
# select * from BUSCONFIND where location= :id
# order by value ASC
# ------------------------------------------------------------------------------ #
url = (test3_url + location)
# print(url)
responsefromadb = requests.get(url, headers={'Authorization': 'Bearer ' + bearer_token}).json()
# This step isn't necessary; it simply prints out the JSON response object in a more readable format.
pprint.pprint(responsefromadb)
The two imports deserve some attention here. The URL comes from the testurls.py file, seen in the previous example. And the token URL, Client ID, and Client Secret come from the oauth2creds.py file. Here are the files, side-by-side:
The test3.py, testurls.py, and oauth2creds.py files
As you can see in the testurls.py file, I’m relying on the test3_url for this example. And the OAuth2.0 information you see comes directly from the OAuth Client I created in Database Actions:
In this image, you can see the Client ID and Client Secret
If I put that all together, I can execute the code in test3.py and “pretty print” the response in my Interactive Window. But first I need to adjust the Resource Handler’s URI (the one I copied and pasted from the “REST Workshop”). It retains the “:id” bind parameter, and the way I have this Python code set up, I need to remove that trailing “:id” since the code appends the location value itself. With that out of the way, I can run this code and review the output.
Running the test3.py code in the Interactive Window. Reviewing the summary output – a JSON array. Reviewing the detailed view of the “items”. Scrolling to the bottom of the GET response body to see the available links for additional items.
From top to bottom and left to right, you’ll see I first execute the code in the Interactive Window. From there, I can review a summary of the response to my GET request. That pretty print library allows us to see the JSON array in a more readable format (one that has indentation and nesting), which you can see in the second image. The third image is a more detailed view of the first half of this response. And I include the final image to highlight the helpful URLs that are included in the response body.
Since I know my limit = 25, and 'hasMore': True appears in the output (seen in that third image), I know there are more items. You can adjust the limit and offset in subsequent requests, but I’ll save the details for another day.
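As a quick taste, though, paging an ORDS collection is just two query parameters (shown here against the auto-REST endpoint from the previous example, since it doesn’t need a token):

# Rows 26-50, i.e., the second page of 25
curl 'https://<your-adb-host>/ords/devuser/busconfind/?limit=25&offset=25'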
Wrap-up
You can probably tell, but this is like an expansion of the previous example. But instead of relying on auto-REST enabling, you are in full control of the Resource Module. And while you don’t need to use OAuth2.0, it’s good practice to use it for database authentication. You can see how the response comes through a little differently compared to the previous example, but it’s still very similar.
In this example, I did all the work, but that might not be the case for you; much of it might be handled for you. The main thing I like about this example is that we rely on stable and popular Python libraries: requests, requests_oauthlib, and oauthlib.
The fact that this is delivered as a JSON object is helpful as well (for the same reasons mentioned in the second example). And finally, I enjoy the fact that you only need to pass a single parameter from your (assumed) presentation layer to your application layer; an example might be a selection from an HTML form or drop-down menu item.
The end
We’re at the end of this fun little exercise. As I mentioned before, I will expand on this third example. There are so many steps, and I think it would be helpful for people to see a more detailed walk-through.
And be on the lookout (BOLO) for a video. There’s no way around this, but a video needs to accompany this post.
And finally, you can find all the code I review in this post in my new “blogs” repository on GitHub. I encourage you to clone, fork, spoon, ladle, knife, etc…
I feel so silly for posting this because you’ll quickly realize that I will have to leave things unfinished for now. But I was so excited that I got something to work, that I had to share!
If you’ve been following along, you know you can always find me here. But I do try my best to cross-post on other channels as well:
But given that everything I do supports the development community, audience statistics are always crucial to me. Because of this, I’ll periodically review my stats on this site and the others to get a feel for the most popular topics.
I even did a RegEx post a while back that was pretty popular too. Thankfully it wasn’t that popular, as it pained me to work through Regular Expressions.
I can quickly review site statistics on this blog, but other places, like Medium, are more challenging to decipher. Of course, you can download your Audience stats, but sadly not your Story stats.
Audience stats download, but no Story stats download.
Undeterred, I wanted to see if it was somehow possible to acquire my Story stats. And it is possible, in a way…
Show and tell
After you log into your Medium account and navigate to your stats page, open up the developer tools in your browser and navigate to the “Console.” From there, reload the page and simply observe all the traffic.
You’ll see a bunch of requests:
GET
POST
OPTIONS (honestly, I’ve no idea what these are, but I also haven’t looked into it yet)
My thought was that the stats content was produced through (or by) one of these API requests. So yes, I expanded every request (one at a time) and reviewed the Response Body of each. I did that until I found something useful. And after a few minutes, there it was:
The magic GET request.
I confirmed I had struck gold by taking this URL, placing it in a new browser window, and hitting Enter. And after selecting “Raw Data,” I saw this:
Double-checking the raw JSON.
Indeed, we see my Story stats. But the final two paths in the URL made no sense to me.
The paths looked similar; I had no choice but to activate Turing Mode™.
I could see these numbers were similar, so I lined them up in my text editor and saw that they shared the same “166” prefix. I don’t know much about machine-readable formats, but since what was appearing on my screen was the last 30 days, I thought this might be some sort of date. But I’d never seen anything like this, so I wasn’t 100% sure.
Unix Time Stamps
After about 20 mins of searching and almost giving up, I found something in our Oracle docs (a MySQL reference guide of all places) that referenced Unix Time Stamps. Eureka!
About Unix time stamps in the Oracle MySQL docs.
Success, I’d found it. So I searched for a “Unix time stamp calculator” and plugged in the numbers. My hunch was correct; it was indeed the last thirty days!
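You can also check one of these values straight from the Terminal; the epoch value here is a made-up example, since mine lives in a screenshot:

# macOS/BSD syntax; on Linux it's `date -d @1664582400`
date -r 1664582400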
Verifying the Unix Time Stamp.
So now I’m wondering if I change that leading date in the GET request will it allow me to grab all my story statistics from January 2022 till now? Oh, hell yeah, it will!
All my Story stats from Jan 2022 to the present.
End of the line
Right, so here is where I have to leave it open-ended. I had a finite amount of time to work on this today, but what I’d like to do is see if I can authenticate with Basic Authentication into my Medium account. And at least get a 200 Response Code. Oh wait, I already did that!?
Getting that sweet, sweet 200 Response Code.
And now the Python code!
import requests
import json
from requests.auth import HTTPBasicAuth
url = "https://medium.com/m/signin"
# I found this to work even if I typically sign on through
# the Google Single-sign-on. I just used the same email/password
# I do when I login directly to google (Gmail).
user = "[Your login/email]"
password = "[Your password]"
r = requests.get(url, auth=HTTPBasicAuth(user, password))
print(r)
# I found this URL in the console but then removed everything after
# the query string (the "?"), and used that for the requests URL
# "/m/signin?operation=login&redirect=https%3A%2F%2Fmedium.com%2F&source=--------------------------lo_home_nav-----------"
You're probably wondering how I found the correct URL for the Medium login page. Easy: I trawled the Console until I found it. This one was a little tricky, but I got it to work after some adjusting. I initially found this (the long URL in the code comment above):
And since I thought everything after that "?" was an optional query string, I just removed it and appended the remaining path to Medium's base URL to get this: https://medium.com/m/signin
If I want to keep the data as is, I know I can load the JSON with a cURL command and an ORDS Batch Load API with ease. I dropped it into my Autonomous Database (via Data Load) to see what it would look like:
My CLOB.
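For the curious, here's a rough sketch of what that Batch Load call might look like in Python rather than cURL. The MEDIUM_STATS table, the assumed JSON shape, and the base URL are all placeholders you'd adjust to your own schema and response; the batchload endpoint itself expects CSV, so the JSON gets flattened first:

import csv
import io
import json

import requests

# Load the saved Story stats JSON from above. The "payload"/"value" keys
# are an assumption -- adjust to the actual shape of your response.
with open("story_stats.json") as f:
    rows = json.load(f)["payload"]["value"]

# Flatten the list of JSON objects into CSV for the batchload endpoint.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)

# POST the CSV to the auto-REST enabled table's batchload endpoint.
# Host, schema, and table name are placeholders.
base = "https://[your-adb-host]/ords/[schema]"
r = requests.post(
    f"{base}/medium_stats/batchload?batchRows=500",
    data=buf.getvalue(),
    headers={"Content-Type": "text/csv"},
)
print(r.status_code, r.text)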
We do something very similar in the Oracle LiveLabs workshop (I just wrote about it here). You can access the workshop here!
I'll have a follow-up to this, but for now, this is the direction I'm headed. If you are reading this and want to see more content like this, let me know! Leave a comment, retweet, like, whatever, so I know I'm not developing carpal tunnel for no reason 🤣.
Recently Jeff and I were invited by the Oracle Developers and Developer Relations teams to do a walkthrough of a LiveLabs workshop, "How to Build Powerful and Secure REST APIs for Your Oracle Autonomous Database."
We spent about 90 minutes moving through selected labs in the workshop. Luckily they recorded it for us; you can watch it in all its glory here.
If that video piques your interest, I encourage you to complete the workshop since it provides an excellent overview of Oracle REST Data Services APIs, specifically when working in Database Actions (in the Oracle Autonomous Database).
About the workshop
Labs 1, 2, and 7 are common across many workshops, so they weren't our focus. The workshop consists of seven labs; labs 3-6 were the main focus.
Two approaches to REST-enabling your Oracle database objects.
We also wanted to highlight the two ways a user could create Oracle REST APIs in Database Actions (formerly SQL Developer Web). You can jump right in with auto-REST enabling or get creative by building your Resource Modules > Templates > Handlers.
Workshop highlights
I won't walk through the labs in detail here, but I will highlight areas that:
Were cool/worth revisiting, or
Have helped (and continue to help) speed up my productivity in Database Actions (and, by association, the Autonomous Database)
The videos are cued up to the related topics.
Lab 3
Lab 3 walks you through connecting to an Autonomous Database with Database Actions. From there, you create a table from a CSV file. And finally, you'll auto-REST enable the table with simple mouse clicks.
Data Loading
I’ve found no less than three GUI-based ways to load data in Database Actions.
Auto-REST enabling
Using mouse clicks to auto-REST enable database objects in the Oracle Autonomous Database.
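If you haven't seen it before, here's a quick sketch of the kind of endpoints auto-REST enabling produces. The host, schema, table, key value, and column names below are all placeholder assumptions:

import requests

# Hypothetical auto-REST enabled table -- swap in your own host/schema/table.
base = "https://[your-adb-host]/ords/[schema]/employees/"

# Auto-REST enabling exposes standard CRUD endpoints on the table:
print(requests.get(base).status_code)              # GET the collection
print(requests.get(base + "7839").status_code)     # GET one row by primary key
new_row = {"empno": 9999, "ename": "BLOGGS"}       # hypothetical columns
print(requests.post(base, json=new_row).status_code)  # POST a new row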
Show Code toggle
The new “Show Code” toggle switch in Database Actions.
This feature isn't limited to the SQL Worksheet; it's found across Database Actions!
cURL command options for your environment
cURL commands now provide PowerShell, Command Prompt, and Bash examples.
Lab 4
Lab 4 walks you through using a Batch Load API to load two million+ rows into the table you previously created (in Lab 3). We also create a SQL procedure and later use PL/SQL to simulate a REST API call to the table.
We briefly discussed the Cloud Shell and Code Editor (both in Oracle Cloud Infrastructure). Click the links to learn more; they are free and included in your OCI tenancy 🙂.
A crash course on query parameters
Jeff has a helpful article here (one I reference A LOT).
You can review our docs here (query parameters come up in several areas).
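As a quick illustration of what those query parameters look like in practice, here's a sketch against a hypothetical auto-REST enabled table (host, schema, table, and column names are all assumptions). ORDS supports paging with limit/offset and row filtering with the q parameter, which takes a JSON filter expression:

import requests

# Hypothetical auto-REST endpoint -- swap in your own host/schema/table.
base = "https://[your-adb-host]/ords/[schema]/employees/"

params = {
    "limit": 25,                       # page size
    "offset": 0,                       # starting row
    "q": '{"salary":{"$gt":5000}}',    # only rows where SALARY > 5000
}

r = requests.get(base, params=params)
r.raise_for_status()
for row in r.json()["items"]:
    print(row)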
Graduating from auto-REST
A short discussion on when and why you may want to move away from auto-REST-enabled Oracle APIs to more customized Oracle REST APIs.
Lab 5
In Lab 5, you use Database Actions and the REST console to build a REST API from a parameterized PL/SQL procedure and SQL statement. We did this manually in the previous lab; here, we REST-enable it (a continuation and refinement of that lab).
This continues to confound me, so if you are in the same boat as me and you want me to do some more dedicated posts on this, let me know!
Lab 6
The goal of this lab was to educate you on Roles, Privileges, and OAuth 2.0 Client Authentication. Unfortunately, we ran out of time and had to speed through this final section. However, I did show off some of the OpenAPI functions within Database Actions.
OpenAPI Specifications
Specifically, we reviewed how you can view your Resource Modules in the OpenAPI view (displayed as a Swagger UI implementation) and view/execute handlers to observe their responses.
We also mentioned how you can export a Resource Module as either PL/SQL code or OpenAPI JSON.
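And since we rushed through the OAuth 2.0 portion, here's a minimal sketch of the client credentials flow against ORDS. The client ID and secret come from registering an OAuth client (as covered in the lab); the host, schema, and protected endpoint below are placeholder assumptions:

import requests

base = "https://[your-adb-host]/ords/[schema]"  # placeholder

# Exchange the OAuth client's credentials for a short-lived bearer token.
token_response = requests.post(
    f"{base}/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=("[client_id]", "[client_secret]"),
)
token = token_response.json()["access_token"]

# Call a protected endpoint with the bearer token.
r = requests.get(
    f"{base}/my_module/my_protected_endpoint",  # hypothetical handler
    headers={"Authorization": f"Bearer {token}"},
)
print(r.status_code, r.json())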
You should be all set to complete this workshop (located here). But why stop the fun there? We have other LiveLabs workshops that might interest you, too. Check them out!
The last workshop on the list is our newest one! So if you do attempt it, feel free to create an issue on my GitHub repository for enhancements (or if anything is unclear and needs updating) 🙂!