Category: SQL

  • From one API to another: Using ORDS Auto-REST to update a table


    It turns out the National Highway Traffic Safety Administration (NHTSA; pronounced 'nit-suh') has a Vehicle Identification Number (VIN) decoder that's freely available to the public. But even cooler is the set of APIs they have. The one I found most interesting, and useful for my use case, was the /vehicles/DecodeVinValues/[your VIN]?format=[format]&modelyear=[year] endpoint.

    And as a cURL command (using my 1999 Chevy Silverado 🤠), with the subsequent server response:

    curl --request GET \
      --url 'https://vpic.nhtsa.dot.gov/api/vehicles/DecodeVinValues/1GCEK14V0XZ138493?format=json&modelyear=1999'
    
    {
    	"Count": 1,
    	"Message": "Results returned successfully. NOTE: Any missing decoded values should be interpreted as NHTSA does not have data on the specific variable. Missing value should NOT be interpreted as an indication that a feature or technology is unavailable for a vehicle.",
    	"SearchCriteria": "VIN(s): 1GCEK14V0XZ138493",
    	"Results": [
    		{
    			"ABS": "",
    			"ActiveSafetySysNote": "",
    			"AdaptiveCruiseControl": "",
    			"AdaptiveDrivingBeam": "",
    			"AdaptiveHeadlights": "",
    			"AdditionalErrorText": "",
    			"AirBagLocCurtain": "",
    			"AirBagLocFront": "",
    			"AirBagLocKnee": "",
    			"AirBagLocSeatCushion": "",
    			"AirBagLocSide": "",
    			"AutoReverseSystem": "",
    			"AutomaticPedestrianAlertingSound": "",
    			"AxleConfiguration": "",
    			"Axles": "",
    			"BasePrice": "",
    			"BatteryA": "",
    			"BatteryA_to": "",
    			"BatteryCells": "",
    			"BatteryInfo": "",
    			"BatteryKWh": "",
    			"BatteryKWh_to": "",
    			"BatteryModules": "",
    			"BatteryPacks": "",
    			"BatteryType": "",
    			"BatteryV": "",
    			"BatteryV_to": "",
    			"BedLengthIN": "",
    			"BedType": "",
    			"BlindSpotIntervention": "",
    			"BlindSpotMon": "",
    			"BodyCabType": "",
    			"BodyClass": "Pickup",
    			"BrakeSystemDesc": "",
    			"BrakeSystemType": "Hydraulic",
    			"BusFloorConfigType": "Not Applicable",
    			"BusLength": "",
    			"BusType": "Not Applicable",
    			"CAN_AACN": "",
    			"CIB": "",
    			"CashForClunkers": "",
    			"ChargerLevel": "",
    			"ChargerPowerKW": "",
    			"CombinedBrakingSystem": "",
    			"CoolingType": "",
    			"CurbWeightLB": "",
    			"CustomMotorcycleType": "Not Applicable",
    			"DaytimeRunningLight": "",
    			"DestinationMarket": "",
    			"DisplacementCC": "4800.0",
    			"DisplacementCI": "292.91397165471",
    			"DisplacementL": "4.8",
    			"Doors": "2",
    			"DriveType": "4WD/4-Wheel Drive/4x4",
    			"DriverAssist": "",
    			"DynamicBrakeSupport": "",
    			"EDR": "",
    			"ESC": "",
    			"EVDriveUnit": "",
    			"ElectrificationLevel": "",
    			"EngineConfiguration": "V-Shaped",
    			"EngineCycles": "",
    			"EngineCylinders": "8",
    			"EngineHP": "",
    			"EngineHP_to": "",
    			"EngineKW": "",
    			"EngineManufacturer": "",
    			"EngineModel": "",
    			"EntertainmentSystem": "",
    			"ErrorCode": "0",
    			"ErrorText": "0 - VIN decoded clean. Check Digit (9th position) is correct",
    			"ForwardCollisionWarning": "",
    			"FuelInjectionType": "Multipoint Fuel Injection (MPFI)",
    			"FuelTankMaterial": "",
    			"FuelTankType": "",
    			"FuelTypePrimary": "",
    			"FuelTypeSecondary": "",
    			"GCWR": "",
    			"GCWR_to": "",
    			"GVWR": "Class 2E: 6,001 - 7,000 lb (2,722 - 3,175 kg)",
    			"GVWR_to": "",
    			"KeylessIgnition": "",
    			"LaneCenteringAssistance": "",
    			"LaneDepartureWarning": "",
    			"LaneKeepSystem": "",
    			"LowerBeamHeadlampLightSource": "",
    			"Make": "CHEVROLET",
    			"MakeID": "467",
    			"Manufacturer": "GENERAL MOTORS LLC",
    			"ManufacturerId": "984",
    			"Model": "Silverado",
    			"ModelID": "1850",
    			"ModelYear": "1999",
    			"MotorcycleChassisType": "Not Applicable",
    			"MotorcycleSuspensionType": "Not Applicable",
    			"NCSABodyType": "",
    			"NCSAMake": "",
    			"NCSAMapExcApprovedBy": "",
    			"NCSAMapExcApprovedOn": "",
    			"NCSAMappingException": "",
    			"NCSAModel": "",
    			"NCSANote": "",
    			"NonLandUse": "",
    			"Note": "",
    			"OtherBusInfo": "",
    			"OtherEngineInfo": "Iron",
    			"OtherMotorcycleInfo": "",
    			"OtherRestraintSystemInfo": "",
    			"OtherTrailerInfo": "",
    			"ParkAssist": "",
    			"PedestrianAutomaticEmergencyBraking": "",
    			"PlantCity": "ROANOKE",
    			"PlantCompanyName": "Fort Wayne Assembly",
    			"PlantCountry": "UNITED STATES (USA)",
    			"PlantState": "INDIANA",
    			"PossibleValues": "",
    			"Pretensioner": "",
    			"RearAutomaticEmergencyBraking": "",
    			"RearCrossTrafficAlert": "",
    			"RearVisibilitySystem": "",
    			"SAEAutomationLevel": "",
    			"SAEAutomationLevel_to": "",
    			"SeatBeltsAll": "",
    			"SeatRows": "",
    			"Seats": "",
    			"SemiautomaticHeadlampBeamSwitching": "",
    			"Series": "1500 (1/2 Ton)",
    			"Series2": "Conventional Cab",
    			"SteeringLocation": "",
    			"SuggestedVIN": "",
    			"TPMS": "",
    			"TopSpeedMPH": "",
    			"TrackWidth": "",
    			"TractionControl": "",
    			"TrailerBodyType": "Not Applicable",
    			"TrailerLength": "",
    			"TrailerType": "Not Applicable",
    			"TransmissionSpeeds": "",
    			"TransmissionStyle": "",
    			"Trim": "",
    			"Trim2": "",
    			"Turbo": "",
    			"VIN": "1GCEK14V0XZ138493",
    			"ValveTrainDesign": "",
    			"VehicleDescriptor": "1GCEK14V*XZ",
    			"VehicleType": "TRUCK",
    			"WheelBaseLong": "",
    			"WheelBaseShort": "",
    			"WheelBaseType": "",
    			"WheelSizeFront": "",
    			"WheelSizeRear": "",
    			"WheelieMitigation": "",
    			"Wheels": "",
    			"Windows": ""
    		}
    	]
    }

    On its own, this is pretty cool. But I wanted to do something more with this information. I wanted to store it, so I could use it later, for a Cloud World-related project.

    A GET and POST Python script

    So, with the help of one of the many popular LLMs, I settled on a Python script that helps me take what you see above and get it "database ready." Here is what that Python code looks like:

    import argparse
    import requests
    import json
    import re
    
    def sanitize_key(key: str) -> str:
        # Convert keys to lowercase, replace non-alphanumeric characters with underscores.
        key = key.strip()
        key = re.sub(r'[^a-zA-Z0-9]+', '_', key)
        return key.lower()
    
    def decode_vin(vin: str, model_year: int):
        url = f"https://vpic.nhtsa.dot.gov/api/vehicles/DecodeVinValues/{vin}?format=json&modelyear={model_year}"
        response = requests.get(url)
        response.raise_for_status()
    
        data = response.json()
    
        if "Results" not in data or not data["Results"]:
            raise ValueError("Invalid API response: No Results found.")
    
        raw_result = data["Results"][0]
    
        # Sanitize keys and skip empty values
        result = {
            sanitize_key(k): v
            for k, v in raw_result.items()
            if v not in (None, "", "Not Applicable")
        }
    
        return result
    
    def post_to_endpoint(payload: dict, endpoint: str):
        headers = {'Content-Type': 'application/json'}
        response = requests.post(endpoint, json=payload, headers=headers)
        response.raise_for_status()
        return response.json()
    
    def main():
        parser = argparse.ArgumentParser(description="A tool to decode VINs and send them to an Oracle REST endpoint")
        parser.add_argument("vin", type=str, help="Vehicle Identification Number")
        parser.add_argument("model_year", type=int, help="Model year of the vehicle")
        parser.add_argument("--pretty", action="store_true", help="Pretty-print output JSON")
        args = parser.parse_args()
    
        # Decode and sanitize
        result = decode_vin(args.vin, args.model_year)
    
        # Print to console
        if args.pretty:
            print(json.dumps(result, indent=2))
        else:
            print(result)
    
        # POST to local Oracle ORDS endpoint
        endpoint_url = "http://localhost:8080/ords/ordsdemo/vin_table/"
        try:
            post_response = post_to_endpoint(result, endpoint_url)
            print("\nPOST response from server:")
            print(json.dumps(post_response, indent=2))
        except requests.HTTPError as e:
            print(f"Error posting to endpoint: {e}")
    
    if __name__ == "__main__":
        main()

    ๐Ÿ‘ˆ๐Ÿผ Click here to learn what all different parts of this script do.

    I’ll explain briefly according to the code blocks:

    • The sanitize_key() function takes the results, or rather the "keys" (which will later become the columns in a database table), and standardizes them. The keys end up all lowercase, with runs of non-alphanumeric characters replaced by underscores.
    • The decode_vin() function is the bulk of the script. It issues a GET request to the VIN decoder endpoint, takes the first entry in the Results array, and drops any empty or "Not Applicable" values. If you can picture it in your head, what I'm doing effectively is creating a table of keys and values.
    • The post_to_endpoint() function and the POST block in main() work in tandem: after the results have been processed, main() hands them to post_to_endpoint(), which sends them to my ORDS endpoint (a table I separately created and Auto-REST enabled with ORDS).
    • The main() function is my entry and exit point for this script. On the command line I'd execute the following command (I'm using the same VIN and model year): python3 vindecode.py 1GCEK14V0XZ138493 1999
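    To see that key standardization in action, here is the sanitize_key() function from the script above, run against a few sample keys (the third key is illustrative; actual NHTSA keys don't contain spaces):

```python
import re

def sanitize_key(key: str) -> str:
    # Lowercase the key and collapse runs of non-alphanumerics into underscores
    key = key.strip()
    key = re.sub(r'[^a-zA-Z0-9]+', '_', key)
    return key.lower()

print(sanitize_key("ModelYear"))           # modelyear
print(sanitize_key("GVWR_to"))             # gvwr_to
print(sanitize_key("Plant Company Name"))  # plant_company_name
```

    These lowercase names then line up with the (case-insensitive) column names in the table DDL further down.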

    Target 23ai Database

    So this whole exercise doesn’t work unless I have a target, in my database. And for this project I’m working on, a simple table that has been Auto-REST enabled is fine. I just need it to store these values. And I can use the VIN as a Primary Key (VINs are unique, and no two are alike).

    And if I’m thinking ahead, I’ll probably never update any single row. I just need to be able to insert a new row with a POST request, and then later issue a GET request by VIN. So an Auto-REST enabled table, with the VIN as a Primary Key will work just fine. As things scale, this might change, but with minimal technical debt incurred, this approach works for now.
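    Since the VIN is the Primary Key, retrieving a stored record later is just a GET against the Auto-REST URI with the VIN appended. Here's a minimal, standard-library-only sketch (the helper names are mine, and the endpoint assumes the same local ORDS setup used elsewhere in this post):

```python
import json
import urllib.request

# Local ORDS Auto-REST endpoint (adjust host and schema alias as needed)
BASE_URL = "http://localhost:8080/ords/ordsdemo/vin_table"

def record_url(vin: str) -> str:
    # With VIN as the Primary Key, the key value goes straight into the URI
    # (instead of an arbitrary Row ID)
    return f"{BASE_URL}/{vin}"

def get_vehicle(vin: str) -> dict:
    # GET a single row by its Primary Key
    with urllib.request.urlopen(record_url(vin)) as response:
        return json.load(response)
```
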

    As an aside, when I'm working through projects like these, where I'm building out a new object, I usually have a general idea of what I want to achieve, but I will go through tons of iterations[1]. This time I ended up creating and dropping something like four table versions before I settled on this one (and four is on the low end, actually). The final DDL for this table:

    CREATE TABLE ORDSDEMO.VIN_TABLE 
        ( 
         VIN                         VARCHAR2 (17) , 
         MAKE                        VARCHAR2 (50) , 
         MAKEID                      NUMBER (5) , 
         MANUFACTURER                VARCHAR2 (100) , 
         MANUFACTURERID              NUMBER (10) , 
         MODEL                       VARCHAR2 (100) , 
         MODELID                     NUMBER (10) , 
         MODELYEAR                   NUMBER (4) , 
         SERIES                      VARCHAR2 (100) , 
         SERIES2                     VARCHAR2 (100) , 
         TRIM                        VARCHAR2 (100) , 
         TRIM2                       VARCHAR2 (100) , 
         VEHICLEDESCRIPTOR           VARCHAR2 (20) , 
         VEHICLETYPE                 VARCHAR2 (50) , 
         BODYCLASS                   VARCHAR2 (50) , 
         BODYCABTYPE                 VARCHAR2 (50) , 
         DRIVETYPE                   VARCHAR2 (50) , 
         BRAKESYSTEMTYPE             VARCHAR2 (50) , 
         BRAKESYSTEMDESC             VARCHAR2 (100) , 
         FUELINJECTIONTYPE           VARCHAR2 (100) , 
         FUELTYPEPRIMARY             VARCHAR2 (50) , 
         FUELTYPESECONDARY           VARCHAR2 (50) , 
         ENGINECONFIGURATION         VARCHAR2 (50) , 
         ENGINECYLINDERS             NUMBER (2) , 
         ENGINEHP                    NUMBER , 
         ENGINEKW                    NUMBER , 
         ENGINEMANUFACTURER          VARCHAR2 (100) , 
         ENGINEMODEL                 VARCHAR2 (100) , 
         DISPLACEMENTCC              NUMBER (6,1) , 
         DISPLACEMENTCI              NUMBER (8,5) , 
         DISPLACEMENTL               NUMBER (4,1) , 
         DOORS                       NUMBER (2) , 
         GVWR                        VARCHAR2 (100) , 
         PLANTCITY                   VARCHAR2 (50) , 
         PLANTSTATE                  VARCHAR2 (50) , 
         PLANTCOUNTRY                VARCHAR2 (50) , 
         PLANTCOMPANYNAME            VARCHAR2 (100) , 
         OTHERENGINEINFO             VARCHAR2 (255) , 
         ERRORTEXT                   VARCHAR2 (255) , 
         ERRORCODE                   NUMBER (2) , 
         AIRBAGSIDE                  VARCHAR2 (50) , 
         AIRBAGFRONT                 VARCHAR2 (50) , 
         AIRBAGCURTAIN               VARCHAR2 (50) , 
         TPMS                        VARCHAR2 (100) , 
         SEATBELTSALL                VARCHAR2 (50) , 
         ENTERTAINMENTSYSTEM         VARCHAR2 (100) , 
         CUSTOMMOTORCYCLETYPE        VARCHAR2 (100) , 
         TRAILERTYPE                 VARCHAR2 (100) , 
         TRAILERBODYTYPE             VARCHAR2 (100) , 
         TRANSMISSIONSTYLE           VARCHAR2 (100) , 
         TRANSMISSIONSPEEDS          VARCHAR2 (50) , 
         WINDOWS                     VARCHAR2 (50) , 
         WHEELS                      VARCHAR2 (50) , 
         WHEELBASESHORT              VARCHAR2 (50) , 
         WHEELBASELONG               VARCHAR2 (50) , 
         WHEELBASETYPE               VARCHAR2 (100) , 
         NOTE                        VARCHAR2 (255) , 
         ABS                         VARCHAR2 (10) , 
         ADAPTIVECRUISECONTROL       VARCHAR2 (50) , 
         ADAPTIVEHEADLIGHTS          VARCHAR2 (50) , 
         FORWARDCOLLISIONWARNING     VARCHAR2 (50) , 
         PEDESTRIANAUTOMATICEBRAKING VARCHAR2 (50) , 
         REARCROSSTRAFFICALERT       VARCHAR2 (50) , 
         REARAUTOMATICEBRAKING       VARCHAR2 (50) , 
         PARKASSIST                  VARCHAR2 (50) , 
         TRACTIONCONTROL             VARCHAR2 (50) , 
         ESC                         VARCHAR2 (50) , 
         DYNAMICBRAKESUPPORT         VARCHAR2 (50) , 
         LANEDEPARTUREWARNING        VARCHAR2 (50) , 
         LANEKEEPSYSTEM              VARCHAR2 (50) , 
         ACTIVESAFETYSYSNOTE         VARCHAR2 (255) , 
         ADDITIONALERRORTEXT         VARCHAR2 (255) 
        ) 
        TABLESPACE USERS 
        LOGGING 
    ;
    
    ALTER TABLE ORDSDEMO.VIN_TABLE 
        ADD PRIMARY KEY ( VIN ) 
        USING INDEX LOGGING ;

    Some background

    Oh, please know that I used an LLM for this DDL too. I could have reduced the number of columns since this particular vehicle has a lot of null or “n/a” fields. But other, future vehicles might actually have values for some of those columns. They can stay for now. You might not like the stylistic choice of condensing all my values, but I think this is inconsequential. You’d probably transform them again, in your application layer, when you pull them down from the database. Also, I’m familiar with these values. I could have abbreviated them even further, but for readability, I think this is good (for future me, and others too).

    ORDS Auto-REST

    The last thing I needed to do was Auto-REST enable my table. Which, if you’ve used the SQL Developer Web UI, you know how easy this is. Two clicks and you’re done. From there I have access to several HTTP methods; I only care about GET [all], GET{by id}, and POST.

    Oh, and if you look at the URI that is provided, it's the same URI that's hard-coded in my Python script (the endpoint_url variable). So that sort of brings us full circle.

    If you recall, I used the VIN as a Primary Key. This image shows how that VIN can now be used in the URI (instead of an arbitrary Row ID).

    Review/Checkpoint

    So, at this point, I have two target endpoints:

    1. One where I retrieve a vehicle’s details by its VIN and model year (I don’t own this one – this is the one provided by NHTSA).
    2. Another (that I own) where I insert and retrieve vehicle records.

    I’ve created a VIN_TABLE table, with the VIN as the Primary Key, and I’ve Auto-REST enabled it with ORDS. I’ve then taken the endpoint for issuing a POST request and added it to the Python script (saved as vindecode.py). Now I can issue the following command to add a record to my table:

    python3 vindecode.py 1GCEK14V0XZ138493 1999

    And just so you know I’m being honest, I’ll drop everything from my target table, and run through this workflow again.

    Back in VS Code, I’ll issue the command, and receive the response from my database (via ORDS).

    And finally, I’ll review the results (I already did this in VS Code, but I’ll do it in the browser and SQL Worksheet too):

    Review

    And that’s it! One of countless examples of how you might work through a pretty common pattern such as this (using LLMs too). The nice thing here is I don’t have to worry about any ORMs (never have, never will). You have to understand, my reality is such that ORMs do not exist. All I’ve ever known is ORDS; a bit of middleware that sends/receives HTTP/HTTPS requests on your behalf, without ever having to map a thing.

    And we’ve made it much easier to take ORDS for a test drive. We’ve been working behind the scenes to improve our Docker/Podman documentation; we just released the latest refresh to our docs here. Hopefully the docs are much more approachable for the container newbie.

    OH! And if you ever have any issues, open a ticket here. Our team now “owns” the ORDS content on that GitHub repository; so, we’ll be a lot more responsive too!

    That is all for now. If you haven't downloaded the latest ORDS, check it out here. Bye 👋🏼.

    P.S. Yes, I realize this post’s image is of a GMT-400, and not a GMT-800. But it was a cool pic, and I really liked it. Credit to The Autopian.

    1. I’m including a link to the prompting I used for creating most of what you see here. I think in this new age of AI/LLMs it is important to understand prompting. Eventually, whether we like it or not, we’ll all need to become “prompt engineers.” And as you’ll see in that thread, you actually have to know what you are doing to be able to ask the right questions and to challenge an LLM’s assumptions.

    A lot of times, these things are just flat out wrong, and overly verbose too. As you'll see, at one point the LLM added a mapping function in one of the iterations, which in my mind was completely unnecessary. Additionally, I wouldn't have constructed this script as a pseudo-command-line interface. BUT…I learned a new library (argparse), and I think I'll explore it more in the future. ↩︎
  • Build an ORDS API Resource Module, GET request with JavaScript fetch, display in HTML

    Recap

    This post is a continuation of a previous one, which can be found here. In this post, I'll build an ORDS Resource Module, issue a GET request with JavaScript's fetch() API, and display the results in HTML.

    If you are coming from the previous related post, then you’ll recall I used the following SQL query:

    My next step is to take this SQL and bring it to the REST Workshop, where I’ll turn it into an API.

    REST Workshop

    There are several ways you can navigate to the REST Workshop. Typically, I return to the Database Actions LaunchPad. From there, I select REST.

    The Handler code

    I've already created my Resource Module, Template, and Handler. I kept everything default, with no authentication enabled.

    The only thing I changed was the SQL query: I removed the final line, which fetched only the first 10 rows. I want to be able to control the pagination of the API. If I were to keep that last line, this eventual endpoint would only ever return the first 10 rows. And what if I want the next ten rows thereafter? If I hard-code this, I can't really make that work. So, I chose to leave it open-ended.

    Technically, it is NOT open-ended because I retained the default pagination of 25. But, by removing that fetch first 10 rows condition, I can now fetch ALL rows that fit those parameters (in increments of 25).
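    As a rough sketch of how that surfaces to a client: ORDS paginated endpoints accept limit and offset query parameters, so walking through all rows in increments of 25 looks something like this (the base URL below is hypothetical; substitute your own Resource Module's endpoint):

```python
from urllib.parse import urlencode

def page_url(base: str, limit: int, offset: int) -> str:
    # ORDS paginated GET endpoints accept limit/offset query parameters
    return f"{base}?{urlencode({'limit': limit, 'offset': offset})}"

# Hypothetical Resource Module endpoint
base = "http://localhost:8080/ords/ordsdemo/movie/all"

print(page_url(base, 25, 0))   # first page: rows 1-25
print(page_url(base, 25, 25))  # second page: rows 26-50
```
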

    If I visit this new endpoint, it will appear like this:

    And if I collapse the items, you’ll see something that is EXTREMELY confusing. If I removed that fetch first 10 rows condition in the original SQL query, then why do we see a limit and offset of 10?

    The answer is because I actually set the Items Per Page equal to 10 (in the Resource Handler). This is the REST equivalent of a dirty joke. Consider yourself roasted…

    JavaScript

    With that endpoint live, I can take the API and drop it into some sample JavaScript and HTML code.

    JavaScript and HTML

    I learned a great deal about this JavaScript by reviewing this YouTube video. That is where I learned how to map through the items of my ORDS payload. And there was a refresher on JavaScript string interpolation (with template literals) too!

    PAUSE: Don't be too intimidated by string interpolation and template literals! Read the link I included, and take your time. If you are coming from Python, it's similar to Jinja templates (when using Flask) and f-string literals 🙃.

    You can see that I'm using the map() method to iterate through all the data in my JSON payload. Remember, this was the payload in the items portion of my endpoint!
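    The pattern looks roughly like this (a sketch with hypothetical field names, not my exact columns; in the real page, the data object comes from fetch() first):

```javascript
// Hypothetical shape of the "items" array in an ORDS response
const data = {
  items: [
    { title: "The Godfather", year: 1972 },
    { title: "Casablanca", year: 1942 },
  ],
};

// map() over each item, building an HTML fragment with template literals
const html = data.items
  .map((item) => `<li>${item.title} (${item.year})</li>`)
  .join("");

console.log(html);

// With fetch(), the same map() runs inside the .then() chain:
// fetch(url).then((res) => res.json()).then((data) => { /* same map() */ });
```
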

    I believe the item in list.map((item) is a reference to an individual item in line 4's data.items. The reason why I think this is that if I change the items in lines 7-10 of my JavaScript to something random, like the name bobby, things start to break:

    However, if I change everything back to item, and start the live server in VS Code, I’ll be met with the following rendering:

    That’s it, though. Combining the ORDS API, the Fetch API, simple JavaScript, and HTML will allow you to create this straightforward web page.

    Reviewing Inspector, Console, Network

    I also have a few more screenshots, one each for the HTML Inspector, Console Log, and Client/Server Network. All of these show what is happening under the covers but in different contexts.

    Inspector

    In the Inspector, you can see how the JavaScript map() method plus the document.querySelector() in line 18 of the JavaScript code work in tandem with line 12 of the HTML script to display contents on the page:

    Console

    Here, you can see the items in the Console. This is because we added console.log(item) in line 19 of the JavaScript code.

    Network

    Finally, you can see the 200 GET request from our ORDS API. Then, on the far right of the screen, you can see the JSON payload coming from that same ORDS endpoint.

    Admittedly, the way the “Cast” is displayed is not correct. That is yet another array of cast members. And I’ve yet to learn how to structure that correctly. So, if you are reading this, and you know, let me know!
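    In the meantime, one possible approach (a sketch, assuming the cast arrives as a JSON array; the field names are illustrative) is a nested map(), so each cast member gets its own list item:

```javascript
// Hypothetical item whose "cast" field is itself an array
const item = {
  title: "The Godfather",
  cast: ["Marlon Brando", "Al Pacino"],
};

// A nested map() renders the inner array as its own list
const castHtml = `<ul>${item.cast
  .map((name) => `<li>${name}</li>`)
  .join("")}</ul>`;

console.log(castHtml);
```
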

    Finally, all the code you've seen in this post can be found in my moviestreamjs GitHub repository.

    That’s all for now!

    Follow

    And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!

  • Tinkering: a SQL script for the ORDS_ADMIN.ENABLE_SCHEMA procedure


    Post-ORDS installation

    Once you’ve installed ORDS, you need to REST-enable your schema before taking advantage of ORDS (I used to forget this step, but now it’s like second nature).

    RESOURCES: I've discussed ORDS installation here and here. I'd check both pages if you're unfamiliar with it or want a refresher. 

    ORDS.ENABLE_SCHEMA / ORDS_ADMIN.ENABLE_SCHEMA

    While logged into your database with SQLcl, you can issue the following commands to not only create a new user but grant them the required Roles (and the underlying Privileges) and REST-enable their schema (aka “ORDS-ifying” a schema):

    /* Remember to remove the brackets when you run this code */ 
    
    Create User [username] Identified By [password];
    Grant Connect to [username];
    Grant Resource to [username];
    
    Grant Unlimited Tablespace to [username];
    
    /* This PL/SQL procedure assumes you are logged in as SYS.
    If you are logged in as that new user, AND HAVE BEEN GRANTED
    THE DBA ROLE, then you can execute the ORDS.ENABLE_SCHEMA 
    procedure instead */
    
    Begin
        ORDS_ADMIN.ENABLE_SCHEMA(p_schema => '[username]');
        Commit;
    End;
    /

    Automate because lazy

    But even that is too much work for me, so I took a stab at automating this via an SQL script. As you can see, the above commands are simple and repeatable (a perfect candidate for automation). And since I’m constantly adding and dropping users for various reasons, copying and pasting code from an old blog or writing everything by hand gets annoying. Additional reasons for automating:

    1. laziness
    2. a desire to improve SQL and PL/SQL skills
    3. an interest in scripting
    4. I get easily distracted

    The script

    After about a day and a half, I have a working prototype script to call upon when I’m on the SQLcl command line. Here is what I came up with:

    NOTE: If you just came here to remix the code, I have this in the scripts folder in my GitHub blog repo as well. Please feel free to sample it and/or roast it/me 🔥. But if you keep scrolling, I'll go into more detail section-by-section.
    18-OCT-2023 UPDATE: I've slightly changed this code to include the Commit; command in the PL/SQL portion of the script. You'll see that reflected in the plsql_block string near the end of the script. Thanks René 🙌🏻!
    PROMPT Choose a new database username:
    ACCEPT NEWUSER CHAR PROMPT 'Enter new user name hurrr:'
    PROMPT Choose a temporary password for &&NEWUSER: 
    ACCEPT NEWPASS CHAR PROMPT 'Make it super secret:'
    
    /*
    I wish I could figure out a way to ONLY ask for username > check 
    that against existing database users > AND THEN allow a user to 
    continue with the 'choose password' step. I was only able to figure 
    out how to ask for the username and password AND THEN checks 
    against the database. I stole the code from this thread: 
    https://stackoverflow.com/questions/30710990/creating-an-oracle-user-if-it-doesnt-already-exist 
    
    Currently, it's just extra steps for the user; kind of annoying. If you're 
    reading this and can figure out a way to get this working, let me know! 
    I'll make the change and attribute you in the comments :) 
    */
    
    Set Verify On 
    
    /*
    You can refer to section 6.3.10.11 for more details on this 
    SET VERIFY command 
    https://docs.oracle.com/en/database/oracle/oracle-database/23/sqpug/using-substitution-variables-sqlplus.html#GUID-3ACD41F3-A5A2-48D5-8E81-C29F9C14C865
    */
    
    /*
    The difference between using single and double ampersands: 
    https://docs.oracle.com/en/database/oracle/oracle-database/23/sqpug/using-substitution-variables-sqlplus.html#GUID-C6BE6E41-821F-413E-B4B1-56AAE4A46298
    */
    
    Declare
    check_if_user_exists Integer;
    plsql_block VARCHAR2(500);
    NEWUSER VARCHAR2(20) := '&&NEWUSER';
    Begin
      Select count(*) Into check_if_user_exists From dba_users Where username = Upper(NEWUSER);
      If (check_if_user_exists = 0) Then
      Execute Immediate 'Create User &&NEWUSER Identified By &&NEWPASS';
      Execute Immediate 'Grant Connect To &&NEWUSER';
      Execute Immediate 'Grant Resource To &&NEWUSER';
      Execute Immediate 'Grant Unlimited Tablespace To &&NEWUSER';
      plsql_block := 'Begin ORDS_ADMIN.ENABLE_SCHEMA(p_schema => :1); Commit; End;';
      Execute Immediate plsql_block using NEWUSER;
      End If;
    End;
    /
    
    /*
    The p_schema parameter is mandatory, that's why I'm including it. 
    If you omit the other parameters, the procedure will use the default 
    parameter values. 
    
    Learn more about this procedure here: 
    https://docs.oracle.com/en/database/oracle/oracle-rest-data-services/23.2/orddg/oracle-rest-data-services-administration-pl-sql-package-reference.html#GUID-459B8B6F-16EC-4FEC-9969-E8231668AD85
    
    I was able to get this entire thing to work through trial-and-error, 
    while also using this for reference: 
    https://docs.oracle.com/cd/B13789_01/appdev.101/b10807/13_elems017.htm
    */
    
    PROMPT
    PROMPT
    PROMPT Congrats 🎉 the user: &&NEWUSER, with the password: &&NEWPASS is now a bona fide database user 🙌🏻! 
    PROMPT Not only that, &&NEWUSER can log into Database Actions and REST-Enable their database objects too 😍!
    PROMPT
    PROMPT
    PROMPT Click RETURN to return to the SQLcl prompt. And NEVER forget:
    PAUSE "You're good enough, you're smart enough, and doggone it, people like you!"
    PROPS: I owe much credit to this StackOverflow post and Jon Heller's and Mark Bobak's comments.

    Breaking it down

    Starting up podman

    I'll first start up podman and cd into the correct directory. "Correct" insomuch as this is where my ordsuser.sql script lives.

    ordsuser.sql, the script I’ll be working with.

    Once my container status displays healthy, I’ll execute the ords serve command. This will launch ORDS in standalone mode (using the embedded Jetty server). I’ll use ORDS in a few minutes, so I should prepare it now.

    Prompting the user

    I’ll then log into my podman container using the following string:

    sql sys/password1234@//localhost:41465/ORCLPDB1 as sysdba
    But why not the podman exec command?
    NOTE: I'm logging in as SYS (I've since been informed you shouldn't really be doing this as SYS; in fact, it looks like you shouldn't be doing this as SYSTEM either). Tom has some good comments in this post here. 

    Why am I not using the podman exec command here?

    If you've seen my latest post on podman ports and networking, this command might seem to contradict that entire article. There are actually two ways (maybe more that I'm unaware of) you can connect to your Oracle database in a podman container. The first way is to hop on over to the container and sign in from inside that container. Your connection string would look like this:

    podman exec -it 21entdb sql sys/password1234@//localhost:1521/ORCLPDB1 as sysdba

    The second option is to sign in remotely like I’m doing in this current example:

    sql system/password1234@//localhost:41465/ORCLPDB1

    This is analogous to when you SSH into a machine remotely. I should have mentioned it in this recent YUM/Oracle Linux post. Still, when you create a Compute Instance, you can later SSH into that Instance and perform actions like you usually would in the Terminal or the Command Prompt. But instead of being on your own machine, you’re on a remote machine. Ports are very confusing (for me, at least), so please read that podman ports post.

    When you use the exec command, consider yourself on another computer, inside that container's Linux operating system. Once there, you must log in using the 1521 port because that is where the database's TNS Listener (deep dive on Oracle database connections) is. However, when you are outside that container (i.e., off that machine's Linux OS), you need to use your local port (in this case, 41465) because it essentially acts as a proxy, or pass-through, to the container's 1521 port. Savvy 🏴‍☠️?

    DISCLAIMER: This is my best-effort attempt at explaining this confusing concept. It is subject to change. But I really want people to take advantage of our tools in the Oracle Container Registry, so I hope this helps!

    Hath connected to the database

    Once I’m in, I can call upon my script to quickly create a new user and REST-enable their schema (recall, I cd’d into the correct directory in an earlier step). The syntax:

    @ordsuser.sql
    MORE SQLcl: You can read about the different SQL, SQLcl, and PL/SQL commands here.

    After pressing Return/Enter, a prompt will appear; this is what it looks like on the “front end”:

    Notice the prompts in those first four lines.
    HINT: I would also spend some time here, learning about the PROMPT, ACCEPT, and INPUT commands. Yes, these come from the SQL*Plus documentation, but they work as you'd expect in SQLcl.
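A minimal sketch of that prompting pattern (the variable names here are illustrative, not necessarily those used in ordsuser.sql):

```sql
PROMPT Creating a new REST-enabled database user...
ACCEPT newuser CHAR PROMPT 'Enter a username: '
ACCEPT newpassword CHAR PROMPT 'Enter a password: ' HIDE
```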

    Meanwhile, here is the corresponding section in the script:

    The first section of the SQL script.

    Once I enter the password and press Enter/Return on my keyboard, the rest of the script executes automatically, using the provided username and password as substitution variables. There is an in-depth explanation in the docs here, but you should grab the code and tinker with it to see how everything interacts and works together.

    NOTE: The Set Verify On command displays the original PL/SQL procedure alongside the updated one (with the substituted username and password). It isn't necessary, but I wanted to provide some feedback to the user.
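For context, here is roughly what VERIFY does in a script; the old/new lines below are hypothetical output, not copied from my script:

```sql
SET VERIFY ON

-- When a line contains a substitution variable, SQLcl echoes it twice,
-- once as written and once with the value filled in, e.g.:
--   old: NEWUSER VARCHAR2(20) := '&&NEWUSER';
--   new: NEWUSER VARCHAR2(20) := 'ordstest';
```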

    PL/SQL procedure

    Assuming the user (i.e., the one you selected) doesn’t exist, the PL/SQL procedure should execute without issues. In the following image, you can see what is happening in real time:

    1. A new user is created with the assigned username and password
    2. That user is granted the Connect and Resource roles
    3. The schema is then REST-enabled using the ORDS_ADMIN.ENABLE_SCHEMA PL/SQL procedure

    Why NEWUSER in the Declare block?

    Yeah, good question. At a minimum, you need to include p_schema in the ORDS_ADMIN.ENABLE_SCHEMA procedure. Read up on that here. But I wanted the username to remain in lowercase since it will later be used in my schema’s URIs (e.g., http://localhost:8080/ords/ordstest/emp/).

    So I gave myself a crash course on define variables and bind arguments; most of what I learned came from this EXECUTE IMMEDIATE documentation. And that’s why you see this in lines 25 and 34 of the code:

    NEWUSER VARCHAR2(20) := '&&NEWUSER'; 
    
    /* as well as */
    
    ORDS_ADMIN.ENABLE_SCHEMA(p_schema => :1);
    You can see the interaction between lines 25 and 34.
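Putting those two lines together, here is a minimal sketch (not the exact script) of how a substitution variable can feed a bind argument through EXECUTE IMMEDIATE:

```sql
DECLARE
  -- && makes SQLcl prompt once and reuse the value; quoting preserves case
  NEWUSER VARCHAR2(20) := '&&NEWUSER';
BEGIN
  -- :1 is a bind placeholder, filled in by the USING clause at run time
  EXECUTE IMMEDIATE
    'BEGIN ORDS_ADMIN.ENABLE_SCHEMA(p_schema => :1); END;'
    USING NEWUSER;
END;
/
```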

    And on the front end, courtesy of the Set Verify On command, you’ll see that updated block of code:

    The username and password fields are updated.

    Feedback

    I’ll then provide myself (or the user) with helpful feedback (along with words of encouragement). Once satisfied, I can press the Return/Enter key to exit the script and sign in to Database Actions as that new user.

    The corresponding section in the script looks like this:

    I’m cheating by adding PROMPT to give line breaks.

    Sign-in to Database Actions

    Now, I can navigate to the Database Actions sign-in page at localhost:8080/ords/sql-developer. If I wanted to, I could also navigate to the newly introduced landing page at http://localhost:8080/ords/_/landing (obviously, depending on your deployment, this address will differ).

    SQL Worksheet then the proof

    I’ll head to a SQL Worksheet, select All Objects in the Navigator tab, and find a clean schema ready to take on the world!

    Summary

    And this marks the end of today’s lesson. So what did we learn?

    1. You can execute SQL scripts directly from the SQLcl command line.
    2. My script is cool, but I wish I could verify whether a user already exists earlier in the script (as it stands, I force the user to go through the password step regardless).
    3. The script has no exception handling (I’m giving myself a pass on this since I’m just now getting into PL/SQL).
    4. Spend some time with the documentation on one screen and your script on another. After some time, you can actually understand how everything interacts.

    One final thought. My process required a lot of trial and error, but seeing how everything flows and works is entertaining.

    Please sample/remix my code; it’s located in my GitHub blog repo. Make it better, and let me know what you come up with! And be sure to check out ORDS and SQLcl ๐Ÿ˜!

    Follow

    And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!

  • HELP!! parse error: Invalid numeric literal at line x, column x?! It’s not your Oracle REST API!!

    HELP!! parse error: Invalid numeric literal at line x, column x?! It’s not your Oracle REST API!!

    A while back (yesterday), I penned a blog post highlighting the ORDS REST-Enabled SQL Service. And in that blog, I displayed the output of a cURL command. A cURL command I issued to an ORDS REST-Enabled SQL Service endpoint. Unfortunately, it was very messy and very unreadable. I mentioned that I would fix it later. Well…it’s now…later (temporal paradox, anybody ๐Ÿคจ?).

    Recap

    If you recall, the output of my POST request looked like this:

    Yikes, you kiss your mother with that mouth?!

    JSON is not displaying correctly

    Well, the reason I didn’t originally pipe the output to json_pp is that this is what happened when I attempted it:


    Jefe to the rescue

    After reading my newly published article, Jefe suggested I try the jq command.

    The Yoda to my Padawan

    Which, of course, I did. Still no luck:

    Different issue though

    Andiamo a googliare!

    Online search to the rescue

    Search online using the keywords “parse error: Invalid numeric literal at,” and you’ll quickly discover that you’re not the only one with this problem.

    Five minutes of research revealed a potential culprit: this appears to be a known issue. For example, a long-standing jq bug on GitHub details this exact scenario. It doesn’t seem to be a jq or json_pp issue; instead, the problem is somehow related to cURL’s -i option and JSON parsing.

    After another few minutes, as luck would have it, I found a Stack Overflow thread discussing the same issue I encountered! After scrolling to the bottom of the thread, I found this golden nugget:

    Thank you Mattias and nhs503 ๐Ÿฅฐ

    Testing without -i

    So, I did just what Mattias and nhs503 suggested. I removed the -i option (-i, or --include) from my cURL command, and wouldn’t you know? The damn thing works as expected! I tested piping to both jq and json_pp. I also concede that jq is the prettier of the two; I appreciate the colors (although, admittedly, this would NOT pass any accessibility testing).

    The modified commands used:

    curl -X POST --user ORDSTEST:password1234 --data-binary "@sportCountryMatrix.sql" -H "Content-Type: application/sql" -k http://localhost:8080/ords/ordstest/_/sql | jq
    curl -X POST --user ORDSTEST:password1234 --data-binary "@sportCountryMatrix.sql" -H "Content-Type: application/sql" -k http://localhost:8080/ords/ordstest/_/sql | json_pp

    And the results:

    Final thoughts

    And for some final thoughts…

    1. It turns out it’s NOT ORDS; it’s the underlying JSON parser not liking the header info that comes through
    2. json_pp and jq both work; they just output the information in a different order
    3. The ORDS REST-Enabled SQL Service returns not only your results but also the SQL statement you originally sent (that is cool, and I didn’t originally realize or mention this)
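If you’re curious why the parsers choke, here’s a quick sketch of the failure (my own illustration, with made-up header and body strings; `python3 -m json.tool` stands in for jq/json_pp as a strict JSON parser):

```shell
body='{"Count": 1, "Results": []}'
headers=$'HTTP/1.1 200 OK\r\nContent-Type: application/json\r\n\r\n'

# The body alone is valid JSON and pretty-prints fine
printf '%s' "$body" | python3 -m json.tool

# Prepend the headers (which is what -i does) and parsing fails immediately:
# the first token the parser sees is "HTTP/1.1", which is not JSON
printf '%s%s' "$headers" "$body" | python3 -m json.tool || echo "parse failed, as expected"
```

The same thing happens to jq and json_pp: with -i they receive the headers and the body as one stream, and the very first characters aren’t valid JSON.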

    And that’s it for this one! I really hope you find this useful, and that it saves you some time troubleshooting and/or hunting for a fix for this tricky problem. That’s all for now!

    Follow

    And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!

  • A quick ORDS REST-Enabled SQL Service example

    A quick ORDS REST-Enabled SQL Service example

    I promise this post will connect back to an overarching theme. But for now, I want to show how you can take a SQL query and use that in combination with the ORDS REST-Enabled SQL Service to request data from a database table.

    The SQL query

    Here is the SQL query I’m using:

    select * from (
      select noc, sport
      from olympic_medal_winners
    )
    pivot (min('X') for sport in (
      'Archery' as arc, 'Athletics' as ath, 'Hockey' as hoc,
      'Judo' as jud, 'Sailing' as sai, 'Wrestling' as wre
      )
    )
    order by noc
    fetch first 7 rows only

    The SQL Script

    Please feel free to cheat like me and steal this same script from the Live SQL site (direct link here). And if you can’t be bothered to do that, the script, in its entirety, can be found at the bottom of this post.

    PAUSE: Shout out to Chris Saxon for conceiving this. And putting in the real work. I'm both too lazy and too dumb to come up with this on my own.

    The demo

    Let’s assume you’ve created the table and inserted all the same data. Now, you can take a SQL query (use the same one as me, or don’t, I don’t care 😑) and run it in a SQL Worksheet (like I did here):

    Coooool dude...you did a thing I already know how to do!

    That’s what you’ll end up with. It’s actually a pretty neat printout; I didn’t even know you could do this! But, I want to take that SQL query and demonstrate how you can do this with the ORDS REST-Enabled SQL service.

    Disclaimer

    I’m performing this demo locally. I have a Podman container running with an Oracle database therein (one I grabbed from our Oracle Container Registry). I’ve also installed ORDS in this database and used my ORDSTEST user (the same setup as in my original Podman/ORDS how-to post).

    About the REST-Enabled SQL Service

    How do I set this service up? When you first install ORDS, if you enable Database Actions, you are also enabling this REST-Enabled SQL Service. You’ll see it in this step in the ORDS Interactive Installer:

    Enter a number to select additional feature(s) to enable:
        [1] Database Actions  (Enables all features)
        [2] REST Enabled SQL and Database API
        [3] REST Enabled SQL
        [4] Database API
        [5] None
      Choose [1]:

    See it? If you select [1], then you are also enabling features [2], [3], and [4].

    PRO TIP: Basically, if you can sign into Database Actions, then you're good. You're all set.

    Long story short. You are taking that whacky SQL from the above example (something that would take me a week to come up with on my own) and passing it as a payload in a POST request to your REST-Enabled SQL Service ๐Ÿค– endpoint.

    Since I am doing this locally, my REST-Enabled SQL Service endpoint looks like this:

    You should see something similar; your REST-enabled schema alias will differ however

    Next, with my Terminal open (and ORDS running, duh!), I’ll issue the following command:

    curl -i -X POST --user ORDSTEST:password1234 --data-binary "@sportCountryMatrix.sql" -H "Content-Type: application/sql" -k http://localhost:8080/ords/ordstest/_/sql 

    The --data-binary option

    Ah-ha! You probably noticed that the --data-binary option references a separate sportCountryMatrix.sql file. That’s because our docs recommend using an SQL file for multi-line SQL statements (like my example). I bet you could pass in this multi-line statement via the command line, but that seems unnecessarily challenging. Plus, I KNOW this works.

    A quick review of the directory setup

    I want to quickly review how I created this file, mostly remaining in the Terminal. I first created a new ordsSqlService directory on my desktop. Then, I made an empty sportCountryMatrix.sql file.

    Using VIM, I opened that file, pasted my choice SQL statement, saved it, and exited. I mention all this because that cURL command only worked since I issued it from the same directory as the SQL file!

    Here are some screenshots of me going through those steps. You’ll see me creating the file but then also using the cat command so that I can double-check the contents of the .sql file.
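Condensed into Terminal commands, those steps look something like this (using a heredoc instead of VIM, and a shortened one-line query for brevity):

```shell
# Create the working directory and move into it
mkdir -p ordsSqlService
cd ordsSqlService

# Write the query to the .sql file (the heredoc stands in for the VIM session)
cat > sportCountryMatrix.sql <<'EOF'
select noc, sport from olympic_medal_winners fetch first 7 rows only
EOF

# Double-check the contents of the .sql file
cat sportCountryMatrix.sql
```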

    And now, back to the cURL command. After issuing the command, here is the response to the POST request:


    I know the response isn’t the most readable, but I can figure that out another time. (I have something else planned as a follow-up to this post.) But it’s all there, trust me (I’m a doctor)!

    Also, this blog post was about 50% me messing around and 50% reminding YOU that ORDS is capable of this (right out of the box, with the correct switches turned on). So, hopefully, you get the gist ๐Ÿ˜„.

    Takeaways

    Let me close this out with some final thoughts…

    1. If you can sign into Database Actions, then you can take advantage of the REST-Enabled SQL Service
    2. I haven’t explored how to pretty print the JSON response so it is more readable (and yes, I tried piping in | json_pp; it didn’t work)
    3. Update on #2…I actually did figure this out. Read about that here.
    4. You have to use your database username and password; this isn’t ideal for two reasons:
      • Security
      • Resource consumption (Basic Authentication can become costly, quickly)
    5. You can take pretty much any SQL query and turn it into a “Resource”
      • For instance, if you are an analyst, you can take that SQL query, save it as a file, and pass it in your cURL command to get precisely what you want.

    Follow

    And don’t forget to follow, like, subscribe, share, taunt, troll, or stalk me!

    -- REM   Script: Pivot and unpivot examples using Olympic data
    -- REM   Examples of pivoting and unpivoting data. Uses a subset of
    -- REM   the results from the Rio Olympics as a data source.
    
    -- For further explanation of the scripts, read the following blog post:
    -- https://blogs.oracle.com/sql/entry/how_to_convert_rows_to
    
    create table olympic_medal_winners (   
      olympic_year int,  
      sport        varchar2( 30 ),  
      gender       varchar2( 1 ),  
      event        varchar2( 128 ),  
      medal        varchar2( 10 ),  
      noc          varchar2( 3 ),  
      athlete      varchar2( 128 ) 
    );
    
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Individual','Gold','KOR','KU Bonchan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Individual','Silver','FRA','VALLADONT Jean-Charles');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Individual','Bronze','USA','ELLISON Brady');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Team','Gold','KOR','Republic of Korea');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Team','Bronze','AUS','Australia');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Archery','M','Men''s Team','Silver','USA','United States');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Floor Exercise','Gold','GBR','WHITLOCK Max');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Floor Exercise','Bronze','BRA','MARIANO Arthur');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Floor Exercise','Silver','BRA','HYPOLITO Diego');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Horizontal Bar','Gold','GER','HAMBUECHEN Fabian');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Horizontal Bar','Bronze','GBR','WILSON Nile');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Artistic Gymnastics','M','Men''s Horizontal Bar','Silver','USA','LEYVA Danell');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 10,000m','Gold','GBR','FARAH Mohamed');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 10,000m','Bronze','ETH','TOLA Tamirat');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 10,000m','Silver','KEN','TANUI Paul Kipngetich');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 100m','Gold','JAM','BOLT Usain');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 100m','Silver','USA','GATLIN Justin');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Athletics','M','Men''s 100m','Bronze','CAN','DE GRASSE Andre');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Gold','CHN','Zhang');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Bronze','GBR','Langridge');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Bronze','GBR','Ellis');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Silver','MAS','Tan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Silver','MAS','Goh');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Badminton','M','Men''s Doubles','Gold','CHN','Fu');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Gold','BRA','Cerutti');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Gold','BRA','Oscar Schmidt');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Silver','ITA','Nicolai');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Silver','ITA','Lupo');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Bronze','NED','Meeuwsen');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Beach Volleyball','M','Men','Bronze','NED','Brouwer');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Gold','CUB','RAMIREZ Robeisy');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Bronze','UZB','AKHMADALIEV Murodjon');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Bronze','RUS','NIKITIN Vladimir');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Bantam (56kg)','Silver','USA','STEVENSON Shakur');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Fly (52kg)','Gold','UZB','ZOIROV Shakhobidin');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Boxing','M','Men''s Fly (52kg)','Bronze','CHN','HU Jianguan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Gold','SVK','PETER Skantar');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Bronze','FRA','GAUTHIER Klauss');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Bronze','FRA','MATTHIEU Peche');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Silver','GBR','RICHARD Hounslow');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Silver','GBR','DAVID Florence');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Slalom','M','Canoe Double (C2) Men','Gold','SVK','LADISLAV Skantar');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Gold','GER','Brendel');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Bronze','UKR','Mishchuk');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Bronze','UKR','Ianchuk');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Silver','BRA','Queiroz dos Santos');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Silver','BRA','de Souza Silva');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Canoe Sprint','M','Men''s Canoe Double 1000m','Gold','GER','Vandrey');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Individual Time Trial','Gold','SUI','CANCELLARA Fabian');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Individual Time Trial','Bronze','GBR','FROOME Christopher');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Individual Time Trial','Silver','NED','DUMOULIN Tom');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Road Race','Gold','BEL','VAN AVERMAET Greg');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Road Race','Silver','DEN','FUGLSANG Jakob');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Road','M','Men''s Road Race','Bronze','POL','MAJKA Rafal');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Keirin','Gold','GBR','KENNY Jason');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Keirin','Bronze','MAS','AWANG Azizulhasni');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Keirin','Silver','NED','BUCHLI Matthijs');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Omnium','Gold','ITA','VIVIANI Elia');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Omnium','Bronze','DEN','HANSEN Lasse Norman');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Cycling Track','M','Men''s Omnium','Silver','GBR','CAVENDISH Mark');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 10m Platform','Gold','CHN','CHEN Aisen');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 10m Platform','Bronze','USA','BOUDIA David');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 10m Platform','Silver','MEX','SANCHEZ German');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 3m Springboard','Gold','CHN','CAO Yuan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 3m Springboard','Silver','GBR','LAUGHER Jack');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Diving','M','Men''s 3m Springboard','Bronze','GER','HAUSDING Patrick');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Individual','Gold','GBR','DUJARDIN Charlotte');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Individual','Bronze','GER','BRORING-SPREHE Kristina');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Individual','Silver','GER','WERTH Isabell');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Team','Gold','GER','Germany');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Team','Bronze','USA','United States');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Equestrian','X','Dressage Team','Silver','GBR','Great Britain');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Individual','Gold','ITA','GAROZZO Daniele');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Individual','Silver','USA','MASSIALAS Alexander');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Individual','Bronze','RUS','SAFIN Timur');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Team','Gold','RUS','Russian Federation');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Team','Bronze','USA','United States');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Fencing','M','Men''s Foil Team','Silver','FRA','France');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','M','Men','Gold','DEN','Denmark');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','M','Men','Silver','FRA','France');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','M','Men','Bronze','GER','Germany');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','W','Women','Gold','RUS','Russian Federation');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','W','Women','Silver','FRA','France');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Handball','W','Women','Bronze','NOR','Norway');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','M','Men','Gold','ARG','Argentina');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','M','Men','Silver','BEL','Belgium');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','M','Men','Bronze','GER','Germany');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','W','Women','Gold','GBR','Great Britain');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','W','Women','Silver','NED','Netherlands');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Hockey','W','Women','Bronze','GER','Germany');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Gold','FRA','RINER Teddy');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Bronze','BRA','SILVA Rafael');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Bronze','ISR','SASSON Or');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men +100 kg','Silver','JPN','HARASAWA Hisayoshi');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men -100 kg','Gold','CZE','KRPALEK Lukas');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Judo','M','Men -100 kg','Bronze','FRA','MARET Cyrille');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','M','Men''s Individual','Gold','RUS','LESUN Alexander');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','M','Men''s Individual','Silver','UKR','TYMOSHCHENKO Pavlo');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','M','Men''s Individual','Bronze','MEX','HERNANDEZ USCANGA Ismael Marcelo');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','W','Women''s Individual','Gold','AUS','ESPOSITO Chloe');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','W','Women''s Individual','Silver','FRA','CLOUVEL Elodie');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Modern Pentathlon','W','Women''s Individual','Bronze','POL','NOWACKA Oktawia');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Group All-Around','Gold','RUS','Russian Federation');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Group All-Around','Bronze','BUL','Bulgaria');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Group All-Around','Silver','ESP','Spain');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Individual All-Around','Gold','RUS','MAMUN Margarita');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Individual All-Around','Silver','RUS','KUDRYAVTSEVA Yana');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rhythmic Gymnastics','W','Individual All-Around','Bronze','UKR','RIZATDINOVA Ganna');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Gold','FRA','Azou');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Bronze','NOR','Brun');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Bronze','NOR','Strandli');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Silver','IRL','O''Donovan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Silver','IRL','O''Donovan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Rowing','M','Lightweight Men''s Double Sculls','Gold','FRA','Houin');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Gold','CRO','Fantela');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Bronze','GRE','Kagialis');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Bronze','GRE','Mantis');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Silver','AUS','Ryan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Silver','AUS','Belcher');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Sailing','M','470 Men','Gold','CRO','Marenic');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Backstroke','Gold','USA','MURPHY Ryan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Backstroke','Bronze','USA','PLUMMER David');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Backstroke','Silver','CHN','XU Jiayu');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Breaststroke','Gold','GBR','PEATY Adam');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Breaststroke','Bronze','USA','MILLER Cody');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Swimming','M','Men''s 100m Breaststroke','Silver','RSA','VAN DER BURGH Cameron');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Singles','Gold','CHN','MA Long');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Singles','Bronze','JPN','MIZUTANI Jun');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Singles','Silver','CHN','ZHANG Jike');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Team','Gold','CHN','China');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Team','Bronze','GER','Germany');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Table Tennis','M','Men''s Team','Silver','JPN','Japan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Gold','AZE','ISAEV Radik');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Bronze','KOR','CHA Dongmin');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Bronze','BRA','SIQUEIRA Maicon');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men +80kg','Silver','NIG','ISSOUFOU ALFAGA Abdoulrazak');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men -58kg','Gold','CHN','ZHAO Shuai');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Taekwondo','M','Men -58kg','Silver','THA','HANPRAB Tawin');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Gold','ESP','Lopez');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Bronze','USA','Johnson');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Bronze','USA','Sock');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Silver','ROU','Tecau');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Silver','ROU','Mergea');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Tennis','M','Men''s Doubles','Gold','ESP','Nadal');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','M','Men','Gold','BLR','HANCHAROU Uladzislau');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','M','Men','Silver','CHN','DONG Dong');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','M','Men','Bronze','CHN','GAO Lei');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','W','Women','Gold','CAN','MACLENNAN Rosannagh');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','W','Women','Silver','GBR','PAGE Bryony');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Trampoline Gymnastics','W','Women','Bronze','CHN','LI Dan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','M','Men','Gold','GBR','BROWNLEE Alistair');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','M','Men','Silver','GBR','BROWNLEE Jonathan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','M','Men','Bronze','RSA','SCHOEMAN Henri');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','W','Women','Gold','USA','JORGENSEN Gwen');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','W','Women','Silver','SUI','SPIRIG HUG Nicola');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Triathlon','W','Women','Bronze','GBR','HOLLAND Vicky');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','M','Men','Gold','BRA','Brazil');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','M','Men','Silver','ITA','Italy');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','M','Men','Bronze','USA','United States');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','W','Women','Gold','CHN','China');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','W','Women','Silver','SRB','Serbia');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Volleyball','W','Women','Bronze','USA','United States');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','M','Men','Gold','SRB','Serbia');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','M','Men','Silver','CRO','Croatia');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','M','Men','Bronze','ITA','Italy');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','W','Women','Gold','USA','United States');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','W','Women','Silver','ITA','Italy');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Water Polo','W','Women','Bronze','RUS','Russian Federation');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s +105kg','Gold','GEO','TALAKHADZE Lasha');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s +105kg','Bronze','GEO','TURMANIDZE Irakli');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s +105kg','Silver','ARM','MINASYAN Gor');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s 105kg','Gold','UZB','NURUDINOV Ruslan');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s 105kg','Bronze','KAZ','ZAICHIKOV Alexandr');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Weightlifting','M','Men''s 105kg','Silver','ARM','MARTIROSYAN Simon');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Gold','TUR','AKGUL Taha');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Bronze','BLR','SAIDAU Ibrahim');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Bronze','GEO','PETRIASHVILI Geno');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 125 kg','Silver','IRI','GHASEMI Komeil Nemat');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 57 kg','Gold','GEO','KHINCHEGASHVILI Vladimer');
    Insert into olympic_medal_winners (OLYMPIC_YEAR,SPORT,GENDER,EVENT,MEDAL,NOC,ATHLETE) values (2016,'Wrestling','M','Men''s Freestyle 57 kg','Bronze','AZE','ALIYEV Haji');
    
    -- This pivots the results by medal. But the columns not listed 
    -- in the pivot form an implicit group by. So this gives the 
    -- medal total per athlete per event.
    
    select * from olympic_medal_winners   
    pivot ( count(*) for medal in (   
     'Gold' gold, 'Silver' silver, 'Bronze' bronze   
    ))   
    order by noc   
    fetch first 6 rows only;
    
    -- To overcome the problem in the previous statement, 
    -- this selects just the columns you need in the subquery. 
    -- But some events have multiple people who win the 
    -- same medal - e.g. doubles tennis. This pivot 
    -- counts rows in the table, not individual events.
    
    select * from (   
     select noc, medal from olympic_medal_winners   
    )   
    pivot ( count(*) for medal in (   
     'Gold' gold, 'Silver' silver, 'Bronze' bronze   
    ))   
    order  by 2 desc, 3 desc, 4 desc   
    fetch first 5 rows only;
    
    -- This solves the over counting problem in the 
    -- previous statement. It does this by finding the 
    -- distinct values for sport, event and gender 
    -- then counting the results.
    
    select * from (   
     select noc, medal, sport, event, gender   
     from olympic_medal_winners   
    )   
    pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) for medal in (   
     'Gold' gold, 'Silver' silver, 'Bronze' bronze   
    ))   
    order  by 2 desc, 3 desc, 4 desc   
    fetch first 5 rows only;
    
    -- You can have many functions in the pivot. 
    -- Oracle generates a column for each function 
    -- per value in the in clause. This finds the 
    -- gold medal winning countries. For each it shows:
    -- - The number of different events these were won in
    -- - The number of different sports they were won in
    -- - The names of the athlete or team who won each medal
    -- Finally it filters to only show those countries 
    -- that won at least two gold medals.
    
    select * from (   
      select noc, medal, sport, event, gender, athlete   
      from   olympic_medal_winners   
    )   
    pivot  (    
      count( distinct sport ||'#'|| event ||'#'|| gender ) medals,   
      count( distinct sport ) sports,   
      listagg( athlete, ',') within group (order by athlete) athletes   
      for medal in ( 'Gold' gold )   
    )   
    where  gold_medals > 1   
    order  by gold_medals, gold_sports, noc   
    fetch  first 5 rows only;
    
    -- This is similar to the previous query. 
    -- But it finds those countries whose IOC code starts with D.
    
    select * from (   
      select noc, medal, sport, event, gender, athlete   
      from   olympic_medal_winners   
    )   
    pivot  (    
      count( distinct sport ||'#'|| event ||'#'|| gender ) medals,   
      count( distinct sport ) sports,   
      listagg( athlete, ',') within group (order by athlete) athletes   
      for medal in ( 'Gold' gold )   
    )   
    where  noc like 'D%'   
    order  by gold_medals;
    
    -- This produces a matrix, sports across the 
    -- top, countries down the side. There's an X 
    -- for each sport that country has a row in the table for.
    
    select * from (     
      select noc, sport     
      from   olympic_medal_winners     
    )     
    pivot (min('X') for sport in (     
      'Archery' as arc, 'Athletics' as ath, 'Hockey' as hoc,   
      'Judo' as jud, 'Sailing' as sai, 'Wrestling' as wre  
      )      
    )     
    order  by noc     
    fetch  first 7 rows only;
    
    -- This is the old-school, pre-Oracle Database 
    -- 11g method for pivoting data.
    
    select noc,    
           count ( case when medal = 'Gold' then 1 end ) gold_medals,    
           count ( case when medal = 'Silver' then 1 end ) silver_medals,    
           count ( case when medal = 'Bronze' then 1 end ) bronze_medals   
    from   olympic_medal_winners   
    group  by noc   
    order  by 2 desc, 3 desc, 4 desc   
    fetch first 5 rows only;
    
    -- An example of how to build the pivot clause 
    -- values dynamically. 
    
    -- Note that when you do this the number of 
    -- columns can change between runs. So the 
    -- execute and fetch routine will be far 
    -- more complex in a real world scenario!
    
    declare   
      sql_stmt     clob;   
      pivot_clause clob;   
    begin   
      select listagg('''' || sport || ''' as "' || sport || '"', ',') within group (order by sport)    
      into   pivot_clause   
      from   (select distinct sport from olympic_medal_winners);   
       
      sql_stmt := 'select * from (select noc, sport from olympic_medal_winners)   
    pivot (count(*) for sport in (' || pivot_clause || '))';   
       
      dbms_output.put_line( sql_stmt );   
       
      execute immediate sql_stmt;   
    end;  
    /
    
    -- The XML keyword dynamically builds the 
    -- list of values to pivot. But you get the 
    -- results in XML! Each "column" is an 
    -- element in this document.
    
    select * from (   
     select noc, sport   
     from   olympic_medal_winners   
    )   
    pivot xml (count(*) medal_winners for sport in (   
     select sport   
     from   olympic_medal_winners   
     where  sport like 'A%')    
    )   
    where rownum = 1;
    
    -- The previous example gave every country 
    -- at least one medal in every sport! To 
    -- avoid this, you need to count a column 
    -- which will be null if the country 
    -- didn't win in a particular event. 
    
    select * from (   
     select noc, sport, athlete   
     from olympic_medal_winners   
    )   
    pivot xml (count(athlete) medal_winners for sport in (   
     select sport   
     from   olympic_medal_winners   
     where  sport like 'A%')    
    )   
    where rownum = 1;
    
    -- This creates the final medal table 
    -- for the unpivot example below.
    
    create table olympic_medal_tables as   
    select * from (   
     select noc, medal, sport, event, gender   
     from olympic_medal_winners   
    )   
    pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) for medal in (   
     'Gold' gold_medals, 'Silver' silver_medals, 'Bronze' bronze_medals   
    ))   
    order  by 2 desc, 3 desc, 4 desc;
    
    -- Unpivot takes the columns and 
    -- converts them back to rows. 
    
    select * from olympic_medal_tables   
    unpivot (medal_count for medal_colour in (   
      gold_medals as 'GOLD',   
      silver_medals as 'SILVER',   
      bronze_medals as 'BRONZE'   
    ))   
    order  by noc   
    fetch  first 6 rows only;
    drop table olympic_medal_tables purge;
    create table olympic_medal_tables as  
    select * from (  
     select noc, medal, sport, event, gender  
     from olympic_medal_winners  
    )  
    pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) medals,   
      count(distinct sport) sports  
      for medal in (  
     'Gold' gold, 'Silver' silver, 'Bronze' bronze  
    ))  
    order  by 2 desc, 4 desc, 6 desc;
    
    -- You can unpivot two or more columns 
    -- to a single row. To do this, provide a 
    -- list of the columns you want to combine. 
    -- You then get a column for each in the results.
    
    select * from olympic_medal_tables   
    unpivot ((medal_count, sport_count) for medal_colour in (   
      (gold_medals, gold_sports) as 'GOLD',    
      (silver_medals, silver_sports) as 'SILVER',    
      (bronze_medals, bronze_sports) as 'BRONZE'   
    ))   
    fetch first 9 rows only;
    drop table olympic_medal_tables purge;
    create table olympic_medal_tables as  
    select * from (  
     select noc, medal, sport, event, gender, athlete  
     from olympic_medal_winners  
    )  
    pivot ( count(distinct sport ||'#'|| event ||'#'||gender ) medals,   
      listagg(athlete, ',') within group (order by athlete) athletes  
      for medal in (  
     'Gold' gold, 'Silver' silver, 'Bronze' bronze  
    ))  
    order  by 2 desc, 4 desc, 6 desc;
    
    -- Another example of unpivoting multiple columns. 
    -- This time with a list of athletes.
    
    select * from olympic_medal_tables   
    unpivot ((medal_count, athletes) for medal_colour in (   
      (gold_medals, gold_athletes) as 'GOLD',    
      (silver_medals, silver_athletes) as 'SILVER',    
      (bronze_medals, bronze_athletes) as 'BRONZE'   
    ))   
    where  medal_colour = 'GOLD'   
    and    medal_count = 2   
    order  by noc   
    fetch first 3 rows only;
    
    -- This first unpivots the results to get the 
    -- list of athletes who won two gold medals. It 
    -- then uses XML tokenization to split 
    -- the list into a row per person.
    
    with rws as (   
      select * from olympic_medal_tables   
      unpivot ((medal_count, athletes) for medal_colour in (   
        (gold_medals, gold_athletes) as 'GOLD',    
        (silver_medals, silver_athletes) as 'SILVER',    
        (bronze_medals, bronze_athletes) as 'BRONZE'   
      ))   
      where  medal_colour = 'GOLD'   
      and    medal_count = 2   
    )   
      select noc, athlete   
      from   rws, xmltable (   
        'if (contains($X,",")) then ora:tokenize($X,"\,") else $X'   
        passing athletes as X   
        columns athlete varchar2(4000) path '.'   
      )   
      order  by 1, 2   
      fetch first 6 rows only;
    
    -- This creates the table of medals won by 
    -- each country per sport for use in the 
    -- examples below.
    
    create table olympic_country_sport_medals as   
    select * from (   
      select noc, sport   
      from   olympic_medal_winners   
    )   
    pivot (count(sport) for sport in (   
      'Athletics' as ath, 'Artistic Gymnastics' as gym, 'Cycling Track' as cyc,  
      'Boxing' as box, 'Sailing' as sai  
      )    
    )   
    order  by 1;
    
    -- This switches the rows and columns over 
    -- aka a transpose. It does so by chaining 
    -- a pivot followed by an unpivot.
    
    select * from olympic_country_sport_medals   
    pivot  (   
      sum(ath) ath, sum(box) box, sum(gym) gym, sum(sai) sai, sum(cyc) cyc   
      for noc in ('BRA' BRA, 'CHN' CHN, 'DEN' DEN, 'ESP' ESP, 'ETH' ETH, 'GRE' GRE )   
    )   
    unpivot (   
      (BRA, CHN, DEN, ESP, ETH, GRE ) for sport in (   
      (BRA_ATH, CHN_ATH, DEN_ATH,  ESP_ATH,  ETH_ATH, GRE_ATH) as 'Athletics',   
      (BRA_GYM, CHN_GYM, DEN_GYM,  ESP_GYM,  ETH_GYM, GRE_GYM) as 'Artistic Gym',   
      (BRA_BOX, CHN_BOX, DEN_BOX,  ESP_BOX,  ETH_BOX, GRE_BOX) as 'Boxing',   
      (BRA_SAI, CHN_SAI, DEN_SAI,  ESP_SAI,  ETH_SAI, GRE_SAI) as 'Sailing',   
      (BRA_CYC, CHN_CYC, DEN_CYC,  ESP_CYC,  ETH_CYC, GRE_CYC) as 'Track Cycling'   
      )   
    );
    
    -- Transposing data using unpivot and pivot. 
    -- Much easier to write than the other way around!
    
    select * from olympic_country_sport_medals   
    unpivot (   
      (medals) for sport in ( ath, box, gym, sai, cyc )   
    )   
    pivot  (   
      sum(medals) for noc in (   
        'BRA' BRA, 'CHN' CHN, 'DEN' DEN, 'ESP' ESP, 'ETH' ETH, 'GRE' GRE  
      )   
    );
  • Python and the Oracle Autonomous Database: Three Ways to Connect

    Python and the Oracle Autonomous Database: Three Ways to Connect

    Watch the deep dive videos:

    Part I

    Overview and connecting with the python-oracledb library

    Part II

    Connecting with Oracle REST APIs unauthenticated

    Part III

    Custom Oracle REST APIs with OAuth2.0 Authorization

    Welcome back

    I finally had a break in my PM duties to share a small afternoon project [I started a few weeks ago]. I challenged myself to a brief Python coding exercise. I wanted to develop some code that allowed me to connect to my Autonomous Database using either our python-oracledb driver (library) or with Oracle REST Data Services (ORDS).

    I undertook this effort as I also wanted to make some comparisons and maybe draw some conclusions from these different approaches.

    NOTE: If you don't feel like reading this drivel, you can jump straight to the repository where this code lives. It's all nicely commented and has everything you need to get it to work. You can check that out here. 

    The test files

    Reviewing the code, I’ve created three Python test files. test1.py relies on the python-oracledb library to connect to an Oracle Autonomous database while test2.py and test3.py rely on ORDS (test3.py uses OAuth2.0, but more on that later).

    Configuration

    Configuration directory

    I set up this configuration directory (config_dir) to abstract sensitive information from the test files. My ewallet.pem and tnsnames.ora files live in this config_dir. These are both required for Mutual TLS (mTLS) connection to an Oracle Autonomous database (you can find additional details on mTLS in the docs here).

    config directory files first test case, chris hoina, senior product manager, oracle rest apis, database tools
    ewallet.pem and tnsnames.ora files

    Other files

    oauth testyurls and wallet creds files for tests, chris hoina, senior product manager, ords, oracle rest apis, database tools, sqlcl
    OAuth2.0, Test URLs, and Wallet Credential files

    Other files include oauth2creds.py, testurls.py, and walletcredentials.py. Depending on the test case, I’ll use some or all of these files (you’ll see that shortly).

    NOTE: If it's not obvious, I wouldn't put any sensitive information into a public git repository. 
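For reference, here is a minimal sketch of what a walletcredentials.py file might contain. The variable names match those imported in test1.py; every value shown is a placeholder, not a real credential.

```python
# walletcredentials.py -- placeholder values only; never commit real credentials.
uname = "ADMIN"                         # database username
pwd = "your-database-password"          # database password
cdir = "/path/to/config_dir"            # directory holding ewallet.pem and tnsnames.ora
wltloc = "/path/to/config_dir"          # wallet location (often the same directory)
wltpwd = "your-wallet-password"         # wallet password
dsn = "yourdb_high"                     # TNS alias from tnsnames.ora
```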

    Connecting with python-oracledb

    One approach to connecting via your Oracle database is with the python-oracledb driver (library). An Oracle team created this library (people much more experienced and wiser than me), and it makes connecting with Python possible.

    FYI: I’m connecting to my Autonomous Database. If you want to try this, refer to the documentation for using this library and the Autonomous database. You can find that here.

    The Python code that I came up with to make this work:

    #Connecting to an Oracle Autonomous Database using the Python-OracleDB driver.
    
    import oracledb
    
    # A separate python file I created and later import here. It contains my credentials, so as not to show them in this script here. 
    
    from walletcredentials import uname, pwd, cdir, wltloc, wltpwd, dsn
    
    # Requires a config directory with ewallet.pem and tnsnames.ora files.
    
    with oracledb.connect(user=uname, password=pwd, dsn=dsn, config_dir=cdir, wallet_location=wltloc, wallet_password=wltpwd) as connection:
        with connection.cursor() as cursor:
    
    # SQL statements should not contain a trailing semicolon (";") or forward slash ("/").
    
            sql = """select * from BUSCONFIND where location='ZAF'
            order by value ASC """
            for r in cursor.execute(sql):
                print(r)

    In Line 7, you can see how I import the wallet credentials from the walletcredentials.py file. Without that information, this code wouldn’t work. I also import the database username, password, and configuration directory (which includes the ewallet.pem and tnsnames.ora files).

    From there, the code is pretty straightforward. However, some library-specific syntax is required (the complete details are in the docs, found here), but aside from that, nothing is too complicated. You’ll see the SQL statement in Lines 16-17; the proper SQL format looks like this:

    SELECT  * FROM busconfind WHERE location='zaf'
    ORDER BY value ASC;

    And here is an example of this SQL output in a SQL Worksheet (in Database Actions):

    testing sql in database actions for python-oracledb driver, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    Reviewing the SQL in Database Actions
    FYI: This is a Business Confidence Index data-set, in case you were curious (retrieved here). 

    That SQL allows me to filter on a Location and then return those results in ascending order according to the Value column. When I do this using the python-oracledb driver, I should expect to see the same results.

    NOTE: You've probably noticed that the SQL in the python file differs from that seen in the SQL Worksheet. That is because you need to escape the single quotes surrounding ZAF, as well as remove the trailing semicolon in the SQL statement. It's all in the python-oracledb documentation, you just have to be aware of this.
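One way to sidestep the quote-escaping entirely is to use bind variables, which python-oracledb supports. A small sketch (the statement and parameter dict only; running it assumes an open cursor like the one in test1.py):

```python
# Bind variables avoid quote-escaping around literals and also guard
# against SQL injection; the value is supplied separately from the SQL.
sql = """select * from BUSCONFIND where location = :location
         order by value asc"""
params = {"location": "ZAF"}

# With an open python-oracledb cursor you would then run:
# for row in cursor.execute(sql, params):
#     print(row)
```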

    Once I have all the necessary information in my walletcredentials.py file, I can import that into the test1.py file and execute the code. I chose to run this in an Interactive Window (I’m using VS Code), but you can also do this in your Terminal. In the images (from left to right), you’ll see the test1.py file, then a summary of the output from that SQL query (contained in the test1.py code), and finally, the detailed output (in a text editor).

    Wrap-up

    For those that have an existing Free Tier tenancy, this could be a good option for you. Of course, you have to do some light administration. But if you have gone through the steps to create an Autonomous database in your cloud tenancy, you probably know where to look for the tnsnames.ora and other database wallet files.

    I’m not a developer, but I think it would be nice to simplify the business logic found in this Python code, or maybe abstract it away completely. For prototyping an application (perhaps one that isn’t microservices-oriented) or for data and business analysts, this could do the trick. In fact, the data is returned to you in rows of tuples, so turning this into a CSV or reading it into a data analysis library (such as pandas) should be fairly easy!
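As a quick illustration of that last point, here is a sketch of writing those rows of tuples out as CSV with the standard library. The rows are hard-coded stand-ins; a real run would iterate over the cursor instead.

```python
import csv
import io

# Stand-in for the rows a cursor.execute() call returns; the column
# names in the header row are illustrative, not taken from BUSCONFIND.
rows = [
    ("ZAF", "2020-01", 99.2),
    ("ZAF", "2020-02", 99.5),
]

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["LOCATION", "PERIOD", "VALUE"])  # header row
writer.writerows(rows)                            # one CSV line per tuple

print(buffer.getvalue())
```

Swapping io.StringIO for a file handle opened with newline="" writes the same output to disk.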

    Connecting via ORDS: sans OAuth2.0

    Auto-REST and cURL

    I’m still using the “devuser” (although this may be unnecessary, as any unsecured REST-enabled table would do). I’m using the same table as before; the only change I’ve made is to auto-REST enable the BUSCONFIND table for the test2.py code.

    In the following images, I’m retrieving the cURL command for performing a GET request on this table.

    NOTE: In a recent ORDS update, we made available different shell variations (this will depend on your OS); I've selected Bash.

    From there, I take the URI (learn more on URIs) portion of the cURL command and place it into my browser. Since this table is auto-REST enabled, I’ll receive only the first 25 rows.

    NOTE: The ORDS default pagination is limit = 25.
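If you want rows beyond that first page, ORDS accepts limit and offset query parameters on the endpoint. A small sketch (the base URI below is a hypothetical placeholder, and page_url is my own helper):

```python
# Hypothetical base URI for an auto-REST enabled table.
base_url = "https://example-db.adb.us-ashburn-1.oraclecloudapps.com/ords/devuser/busconfind/"

def page_url(base, page, size=25):
    """Build the URI for a given zero-based page of results."""
    return f"{base}?limit={size}&offset={page * size}"

second_page = page_url(base_url, 1)  # requests rows 26-50
```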

    The code

    And the code for this test2.py looks like this:

    # Auto-REST enabled with ORDS; in an Oracle Autonomous Database with query parameters.
    
    import requests
    import pprint
    
    # Importing the base URI from this python file.
    
    from testurls import test2_url
    
    # An unprotected endpoint that has been "switched on" with the ORDS Auto-REST enable feature. 
    # Query parameters can be added/passed to the Base URI for GET-ing more discrete information.
    
    url = (test2_url + '?q={"location":"ZAF","value":{"$gt":100},"$orderby":{"value":"asc"}}')
    
    # For prototyping an application, in its earlier stages, this could really work. On your front end, you 
    # expect the user to make certain selections, and you'll still pass those as parameters. 
    # But here, you do this as a query string. In later stages, you may want to streamline your application
    # code by placing all this into a PL/SQL or SQL statement. Thereby separating application
    # logic and business logic. You'll see this approach in the test3.py file. 
    # This works, but you can see how it gets verbose, quickly. It's a great jumping-off point.
    
    responsefromadb = requests.get(url)
    pprint.pprint(responsefromadb.json())

    Lines 8 and 13 are the two areas to focus on in this example. In Line 8, I import my URL from the testurls.py file (again, abstracting it so it’s not in the main body of the code).

    test2 python file and testurls file for second test, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    The test2.py and testurls.py files

    And then, in Line 13, I appended a query string to the end of that URL. ORDS expects the query parameters to be a JSON object with the following syntax:

    [ORDS Endpoint]/?q={"JSON Key": "JSON Value"}

    The new, complete query string below requests the same information as was requested in the test1.py example:

    url = (test2_url + '?q={"location":"ZAF","value":{"$gt":100},"$orderby":{"value":"asc"}}')

    This string begins with that same base URI for the ORDS endpoint (the auto-REST enabled BUSCONFIND table) and then applies the query string prefix “?q=” followed by these parameters:

    1. Filter by the location "ZAF"
    2. Limit the search of these locations to values (in the Value column) greater than ($gt) 100
    3. Return these results in ascending order (asc) of the Value column
    NOTE: You can manipulate the offsets and limits in the python-oracledb driver too. More info found here. And filtering in queries with ORDS can be found here. 
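Hand-writing that filter string makes it easy to miscount the braces. One alternative is to build the filter as a Python dict and let json.dumps produce the string (in test2.py the base would be test2_url; requests can then handle the URL encoding):

```python
import json

# The same filter as the hand-written string, built from a dict so the
# braces always balance.
q = {"location": "ZAF", "value": {"$gt": 100}, "$orderby": {"value": "asc"}}
query_string = json.dumps(q)

# With the requests library, you could then let it handle the encoding:
#   requests.get(test2_url, params={"q": query_string})
```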

    And if I run the test2.py code in the VS Code Interactive Window, I’ll see the following summary output.

    summary output from python test2, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    Summary output from the response in test2.py

    Here is a more detailed view in the VS Code text editor:

    detailed output from python test2, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    Detailed output with helpful links

    Wrap-up

    A slightly different approach, right? The data is all there, similar to what you saw in the test1.py example. There are a few things to note, though:

    • The consumer of this ORDS REST API doesn’t need access to the database (i.e. you don’t need to be an admin or have a schema); you can perform GET requests on this URI.
    • The response body is in JSON (ubiquitous across the web and web applications)
      • Also, language and framework agnostic (the JSON can be consumed/used widely, and not just with Python)
    • You are provided a URI for each item (i.e. entry, row, etc.)
    • No need for SQL; just filter with the JSON query parameters
    • No business logic in the application code
    • Needless to say, no ORMs or database modeling is required for this approach

    However…security is, ahem…nonexistent. That is a problem and flies in the face of what we recommend in our ORDS Best Practices.

    Connecting via ORDS: secured with OAuth2

    Note: This is an abbreviated explanation; I'll be posting an expanded write-up on this example posthaste!

    Since this is what I’m considering “advanced” (it’s not difficult, there are just many pieces), I’m going to keep this section brief. Long story short, I’ll take those query parameters from above and place them into what is referred to as a Resource Handler.

    TIME-OUT: Auto-REST enabling a database object (the BUSCONFIND table in this case) is simple in Database Actions: just left-click > REST-enable. You saw that in the previous example. You are provided an endpoint, and you can use the query parameters (i.e., the JSON {key: value} pairs) to access whatever you need from that object. 
    
    However, creating a custom ORDS REST endpoint is a little different. First you create a Resource Module, next a (or many) Resource Template/s, and then a (or many) Resource Handler/s. In that Resource Handler, you'll find the related business logic code for that particular HTTP operation (the menu includes: GET, POST, PUT, and DELETE). 

    The Resource Module

    The process of creating a custom ORDS API might be difficult to visualize, so I’ll include the steps I took along with a sample query (in that Resource Handler) to help illustrate.

    Chances are you may be the administrator of your Always Free tenancy, so you have full control over this. Other times, you might be provided the REST endpoint. In that case, you may not ever have to worry about these steps. Either way, you can see how we’re simulating (as well as both abstracting and keeping the business logic in the database) the query with this final example (test3.py).

    Security

    The OAuth 2.0 authorization framework enables a third-party application to obtain limited access to an HTTP service, either on behalf of a resource owner by orchestrating an approval interaction between the resource owner and the HTTP service, or by allowing the third-party application to obtain access on its own behalf.

    RFC 6749: The OAuth 2.0 Authorization Framework

    I’ll keep this section brief, but I’m protecting this resource through the aid of an ORDS OAuth2.0 client. I’ve created one here:

    creating an oauth2 client for test3 example, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    After creating a client you can use the provided URL for requesting a new Bearer Token

    And, as you’ll see shortly, I’ll rely on some Python libraries for requesting an Authorization Token to use with the related Client ID and Client Secret. If you want to nerd out on the OAuth2.0 framework, I challenge you to read this.

    test3.py example

    NOTE: Remember, I'm keeping this section intentionally brief. It deserves a slightly deeper dive, and class is almost over (so I'm running out of time). 

    The code for this example:

    # Custom ORDS Module in an Oracle Autonomous Database.
    
    import requests
    from requests_oauthlib import OAuth2Session
    from oauthlib.oauth2 import BackendApplicationClient
    import pprint
    import json
    
    # Importing the base URI from this python file.
    
    from testurls import test3_url
    
    # A separate python file I created and later import here. It contains my credentials, 
    # so as not to show them in this script here. 
    
    from oauth2creds import token_url, client_id, client_secret
    
     
    client = BackendApplicationClient(client_id=client_id)
    oauth = OAuth2Session(client=client)
    
    token = oauth.fetch_token(token_url, client_id=client_id, client_secret=client_secret)
    
    bearer_token = token['access_token']
    
    # Location can be anything from the table. Now, only this single variable needs to be passed; the business
    # logic has been abstracted somewhat, as it now resides within ORDS. This could make your application more
    # portable (to other languages and frameworks), since there are fewer idiosyncrasies and dependencies:
    
    location = "ZAF"
    
    # print(location)
    
    # ------------------------------------------------------------------------------ # 
    # In Database Actions, we:
    #   1. Create an API Module
    #   2. Then create a Resource Template
    #   3. Finally, a GET Resource Handler that consists of the code from test1.py:
    
    #           select * from BUSCONFIND where location= :id
    #               order by value ASC
    # ------------------------------------------------------------------------------ # 
    url = (test3_url + location)
    # print(url)
    
    responsefromadb = requests.get(url, headers={'Authorization': 'Bearer ' + bearer_token}).json()
    
    # This step isn't necessary; it simply prints out the JSON response object in a more readable format.
    
    pprint.pprint(responsefromadb)

    Lines 11 and 16 deserve some attention here. The URL in Line 11 comes from the testurls.py file, seen in the previous example. And the contents of Line 16 come from the oauth2creds.py file. Here are the files, side-by-side:

    test3 python oauthcreds and test url files, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    The test3.py, testurls.py, and oauth2creds.py files

    As you can see in the testurls.py file, I’m relying on the test3_url for this example. And the OAuth2.0 information you see comes directly from the OAuth Client I created in Database Actions:

    creating an oauth2 client for test3 example, chris hoina, senior product manager, oracle rest apis, ords, python, autonomous database
    In this image, you can see the Client ID and Client Secret

    If I put that all together, I can execute the code in test3.py and “pretty print” the response in my Interactive Window. But first I need to adjust the Resource Handler’s URI (the one I copied and pasted from the “REST Workshop”). It retains the “:id” bind parameter. But the way I have this Python code set up, I need to remove it. It ends up going from this:

    test3_url = 'https://abc123def456ghi789-darmok.adb.us-ashburn-1.oraclecloudapps.com/ords/devuser/myapi/locations/:id'

    To this:

    test3_url = 'https://abc123def456ghi789-darmok.adb.us-ashburn-1.oraclecloudapps.com/ords/devuser/myapi/locations/'

    Otherwise, if I don’t remove that bind parameter when the code executes, the target URL will end up looking like this:

    'https://abc123def456ghi789-darmok.adb.us-ashburn-1.oraclecloudapps.com/ords/devuser/myapi/locations/:idZAF'
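Since forgetting to strip that placeholder is an easy mistake, a small guard in the Python code can handle it. (build_url is my own helper, not part of the post's files; the URI is the illustrative endpoint from this example.)

```python
def build_url(base: str, value: str) -> str:
    """Drop a trailing ':id' bind-parameter placeholder, then append the path value."""
    if base.endswith(":id"):
        base = base[: -len(":id")]
    return base + value

copied = "https://abc123def456ghi789-darmok.adb.us-ashburn-1.oraclecloudapps.com/ords/devuser/myapi/locations/:id"
url = build_url(copied, "ZAF")
```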

    With that out of the way, I can run this code and review the output.

    From top-to-bottom, left-to-right you’ll see I first execute the code in the Interactive Window. From there I can review a summary of the response to my GET request. That pretty print library allows us to see the JSON array in a more readable format (one that has indentation and nesting); which you can see in the second image. The third image is a more detailed view of the first half of this response. And I include the final image to highlight the helpful URLs that are included in the response body.

    Since I know my limit = 25, and the 'hasMore': True (seen in the output in that third image) exists, I know there are more items. You can adjust the limit and offset in subsequent requests, but I’ll save that for another day.
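Those subsequent requests could look something like the sketch below: keep requesting while the response reports hasMore, bumping offset by the page size each time. The fetch_page callable is an assumption of mine so the loop can be exercised without a live database; in practice it would wrap requests.get with the Bearer-token header.

```python
def fetch_all(fetch_page, limit=25):
    """Collect every item from a paged ORDS collection."""
    offset, items = 0, []
    while True:
        # fetch_page is expected to return ORDS's {"items": [...], "hasMore": bool} shape.
        page = fetch_page(offset, limit)
        items.extend(page["items"])
        if not page.get("hasMore"):
            return items
        offset += limit

# Exercising the loop with a stand-in for the real REST call:
def fake_page(offset, limit):
    data = list(range(60))
    chunk = data[offset:offset + limit]
    return {"items": chunk, "hasMore": offset + limit < len(data)}

all_items = fetch_all(fake_page)  # 60 items collected across three pages
```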

    Wrap-up

    You can probably tell, but this is like an expansion of the previous example. But instead of relying on auto-REST enabling, you are in full control of the Resource Module. And while you don’t need to use OAuth2.0, it’s good practice to use it to protect your endpoints. You can see how the response comes through a little differently compared to the previous example, but it's still very similar.

    In this example, I did all the work, but that might not be the case for you; much of it might be handled for you. The main thing I like about this example is that we rely on stable and popular Python libraries: requests, requests_oauthlib, and oauthlib.

    The fact that this is delivered as a JSON object is helpful as well (for the same reasons mentioned in the second example). And finally, I enjoy the fact that you only need to pass a single parameter from your (assumed) presentation layer to your application layer; an example might be a selection from an HTML form or drop-down menu item.

    The end

    We’re at the end of this fun little exercise. As I mentioned before, I will expand on this third example. There are so many steps, and I think it would be helpful for people to see a more detailed walk-through.

    And be on the lookout (BOLO) for a video. There’s no way around this, but a video needs to accompany this post.

    And finally, you can find all the code I review in this post in my new “blogs” repository on GitHub. I encourage you to clone, fork, spoon, ladle, knife, etc…

    Oh and…

  • Podman, Oracle Container Registry, and SQLcl

    Podman, Oracle Container Registry, and SQLcl

    Containers are kind of like terrariums.


    Table of Contents

    Click a link to jump straight to that section.

    1. Intro to Podman
    2. What I am trying to do
    3. Getting all the prerequisites
    4. Installing Podman Part 1
    5. Resolving X-code (xcrun) problems
    6. Installing Podman Part 2
    7. Exploring the Oracle Container Registry
    8. Getting to know the container
    9. Reviewing container custom configurations
    10. Attempting to understand ports
    11. Starting the Podman Linux VM
    12. Reviewing database logs
    13. Changing the default password
    14. Logging on with SQLcl
    15. Creating, loading a new table
    16. SQL query on the new table
    17. About data persisting in Podman
    18. Dรฉnouement

    Intro to Podman, man

    What is Podman?

    Podman is a daemonless container engine for developing, managing, and running OCI Containers on your Linux System. Containers can either be run as root or in rootless mode. Simply put: alias docker=podman.

    the deets

    I’ve spent the past couple of weeks setting up Podman to work on my MacBook.

    my macbook pro setup, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
    My current setup

    I really wanted to take advantage of our Oracle Container Registry. While we have several containers, the one I’m most interested in is the freely available Oracle Enterprise Database version 21.3.0.0.

    FYI: It can be found in the Database containers category.

    I wanted to learn more about containers while also connecting locally (i.e., from my MacBook) via SQLcl to said container. In that scenario, as far as my computer is concerned, the container is a production database running elsewhere in the world. Oh, and I’m using Podman instead of Docker to do all this. 

    A summary of what I intend to do

    In short, I wanted to: 

    • provision a database inside a container
    • be able to start and stop it at will, and 
    • be able to connect to it with various Oracle database tools

    I began this exercise with SQLcl since it was used in one of my recent posts. But as a follow-on to this article, I’d like to install ORDS on my local computer and then connect again but with ORDS joining the party. But that’s for another time. 

    Let’s make ah the pizza

    But before connecting to this container, you’ll need a lot of prerequisites. As far as “ingredients” go, you’ll need the following:

    • Homebrew installed and updated. (If you need to do this, review my recent article for instructions.)
    • Podman installed and updated
    • Apple X-Code Command Line Tools updated (this is tricky, so check my notes below)
    • SQLcl (you can review the installation steps here)
    • A sample CSV file (the subject doesn’t matter; I grabbed one from Kaggle – “IMDb’s Top 100 Movies“)

    The setup before the setup

    Since this was such a huge PITA, I’m going to walk through all the steps I took to make Podman work on my MacBook. I’ve done this about ten times so far to make sure I’m clearly explaining every step I took.

    I first opened up a new Terminal session using Spotlight (Left Command + Spacebar). Once in Spotlight, I searched for “terminal” and hit enter, and a new Terminal window appeared.

    From there, I reviewed Homebrew using the brew list command. If you’re following along, you’ll see a list similar to mine, depending on what you have installed.

    reviewing brew installations in terminal window, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
    Wo ist mein (or is it meine?) Podman?

    Attempting to install Podman on Mac

    The next part is easy. I installed Podman with the following command: brew install podman. Homebrew will run through an auto-update, and eventually, Podman will begin installing.

    Upon the first installation, and depending on the macOS you are on, you may see a couple of errors appear. I can tell you they will cause issues within Podman later on down the line. So (and without having to take you back down the rabbit hole with me), you’ll need to uninstall Podman with the brew uninstall podman command.

    The errors can be seen in this image:

    errors found when installing podman, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
    Errors with Podman installing

    There are a few ways one can remedy this. First, you should uninstall Podman, close your Terminal window, and open up a new Terminal window. I found (via this GitHub issue) that this is a known bug. Some have suggested running the brew doctor command, which reveals any potential problems Homebrew has discovered. This seems like a good practice regardless, and I wasn’t aware of this feature until now!

    And while writing this article, I did just that and found two errors I’ll need to fix. I’m still trying to figure out what either means, but the one about the executable is troubling.

    errors found when using brew doctor command, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
    Errors found with brew doctor command

    X-Code problems

    But back to the Podman issue. To resolve the xcrun errors, I stumbled upon a solution buried deep in the recesses of the internet. Long story short, I needed to manually install Apple’s X-Code Command Line tools. But if you try and the installation fails, you have to take an extra step.

    77 hours to install xcode command line tools, chris hoina, senior product manager, ords, db tools
    BTW, it did NOT take 78 hours to download and install

    The x-tra step

    If the xcode-select --install command fails, you have to remove the Command Line Tools from your machine altogether. I did this with the following command:

    sudo rm -rf /Library/Developer/CommandLineTools

    If you want to bore yourself with this issue, here are some resources I found: 


    On sudo

    I’ve seen sudo in the past; I wonder if I ever bothered to look up its meaning. Taken directly from the docs:

    Sudo (su โ€œdoโ€) allows a system administrator to give certain users (or groups of users) the ability to run some (or all) commands as the superuser or another user, while logging all commands and arguments. Sudo operates on a per-command basis, it is not a replacement for the shell.

    A short introduction

    Back to our regularly scheduled program…you’ll probably need to enter your system’s password (the one you use to log on to your computer when it first starts up or wakes). And after that, restart your Terminal (I don’t believe I did this, but it’s probably a good idea).

    Once that new Terminal window fired up, I used the following command to install the latest X-Code Command Line tools:

    sudo xcode-select --install

    Reminder, it will not take 78 hours to install this. I just followed the prompts (license terms, the usual stuff, etc.).

    NOTE: I suspect we have to do this because for some reason, X-Code Command Line tools are not updated upon every macOS version update. So, who knows when the last time these tools have been updated. This is just a hunch, but in reality, I've no idea what I'm talking about. 

    Installing Podman, second attempt

    Once that installation was completed, I restarted the Terminal and issued the command to install Podman:

    brew install podman

    NOTE: If you want to follow along with the Podman instructions, you can do that here (those are the actual docs, very easy to do). 

    If all goes well, your output will look something like this:

    error free since 2023, brew install with podman, no issues, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    No errors after the updating Command Line tools

    If you issue the brew list command, you’ll see both SQLcl and Podman installed.

    brew list to check currently installed packages, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Ready to roll with SQLcl and Podman
    PRO TIP: You can issue the brew upgrade sqlcl command to update to the latest brew-available version of SQLcl.

    To the Oracle Container Registry

    Are we ready yet? Well, almost. Again, if you’re following along, navigate to our Oracle Container Registry site to retrieve the database container for this “recipe.” The path I took was Landing page > Database > Enterprise.

    YOU NEED TO SIGN IN for this to work!!! Oh, suuuuure… it’ll seem like it’s working when you’re in Podman, and you’ve tried ten times…but then it just keeps failing, and failing, and failing! So be sure to sign in (or create an account if you haven’t already). 

    Once you’ve signed in and chosen your preferred language, you’ll see this:

    signed into oracle container registry, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    I’m ready to head back to the Terminal

    PAUSE: reviewing the limitations of this exercise

    Alright, so there are a few limitations I should address, and in no particular order:

    • Checkpointing containers in Podman
    • Volumes in Podman
    • Creating the database versus signing on to the database

    Checkpointing currently doesn’t work in Podman (at least for Macs on macOS Ventura). This is documented as well. Here’s a GitHub issue I found. I don’t seem to be the only one with the issue. I spent about a day on this trying to get it to work. I couldn’t figure it out; maybe if you’re reading this, you know the secret. Please share if you know!

    Secondly, I couldn’t figure out how to mount a volume to a container. I know this is fundamental to containers, but I encountered error after error for days. And for the purposes of this exercise, it isn’t a big deal. Now, if I were on an actual development team, that would be a different story. But I’m too dumb for development, that is why I’m a product manager ๐Ÿคฃ!

    Finally, working with containers requires a paradigm shift. Shortly you’ll see that I’m setting up a container and “starting” the database therein. Later, I’ll separately log on to that database, using SQLcl, after the database is up and running. They are two different steps.

    Working with the container

    Looking at this screen you would think, “I’m just going to jump right in and execute the first command I see on this page.” Wrong!

    first docker command on registry page, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database  copy
    Initial docker run command

    Actually, you do NOT want to do that. You must scroll down to the “Connecting from outside of the container” section, because I’m going to be connecting to this container from the outside.

    referring to the custom configurations, oracle container registry page, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database  copy
    Referring to the Custom Configurations section

    I know this documentation mentions SQL*Plus, but this all applies to SQLcl also. And if you refer to my previous SQLcl post, you can review the logon syntax for logging on. The critical point is that I need to start the container with the -p (or Port) option included. Are you still with me? Let’s take a trip to the “Custom Configurations” section.

    Reviewing the custom configurations section

    custom configurations for connecting to the container, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Referring to the -p configuration parameters

    I’ve cleaned up the code block for readability:

    docker run -d --name <container_name> \
     -p <host_port>:1521 -p <host_port>:5500 \
     -e ORACLE_SID=<your_SID> \
     -e ORACLE_PDB=<your_PDBname> \
     -e ORACLE_PWD=<your_database_password> \
     -e INIT_SGA_SIZE=<your_database_SGA_memory_MB> \
     -e INIT_PGA_SIZE=<your_database_PGA_memory_MB> \
     -e ORACLE_EDITION=<your_database_edition> \
     -e ORACLE_CHARACTERSET=<your_character_set> \
     -e ENABLE_ARCHIVELOG=true \
     -v [<host_mount_point>:]/opt/oracle/oradata \
    container-registry.oracle.com/database/enterprise:21.3.0.0
    
    Parameters:
    --name
        The name of the container (default: auto generated)
     
    -p
        The port mapping of the host port to the container port.
        Two ports are exposed: 1521 (Oracle Listener), 5500 (OEM Express)
     
    -e ORACLE_SID
        The Oracle Database SID that should be used (default:ORCLCDB)
     
    -e ORACLE_PDB
        The Oracle Database PDB name that should be used (default: ORCLPDB1)
     
    -e ORACLE_PWD
        The Oracle Database SYS, SYSTEM and 
        PDBADMIN password (default: auto generated)
     
    -e INIT_SGA_SIZE
        The total memory in MB that should be used for all 
        SGA components (optional)
     
    -e INIT_PGA_SIZE
        The target aggregate PGA memory in MB that should be used 
        for all server processes attached to the instance (optional)
    
    -e ORACLE_EDITION
        The Oracle Database Edition (enterprise/standard, default: enterprise)
    
    -e ORACLE_CHARACTERSET
        The character set to use when creating the database (default: AL32UTF8)
     
    -e ENABLE_ARCHIVELOG
        To enable archive log mode when creating the database (default: false). 
        Supported 19.3 onwards.
     
    -v /opt/oracle/oradata
        The data volume to use for the database. Has to be writable by the 
        Unix "oracle" (uid: 54321) user inside the container. If omitted, the 
        database will not be persisted over container recreation.
     
    -v /opt/oracle/scripts/startup | /docker-entrypoint-initdb.d/startup
        Optional: A volume with custom scripts to be run after database startup.
        For further details see the "Running scripts after setup and on
        startup" section below.
     
     -v /opt/oracle/scripts/setup | /docker-entrypoint-initdb.d/setup
        Optional: A volume with custom scripts to be run after database setup.
        For further details see the "Running scripts after setup and on startup" 
        section below

    A note on syntax: the angle-bracket placeholders (e.g., <host_port>) are stand-ins you replace with your own values; you wouldn’t include the brackets in your commands. The colons, however, are real syntax: in -p <host_port>:1521 and -v <host_mount_point>:/opt/oracle/oradata, the colon separates the host side of the mapping from the container side. If you are coming from database development, you might think, “ahh, bind parameter.” That is not the case here.

    Ports

    You might be asking: in this code block, what the hell am I supposed to be looking at? Well, the container has a “listener”, listening on port 1521. So if I want to connect to the container, I’ll need to “map” a port on my machine (the host) to that container port. I’m not sure if that is what it is called exactly (not a networking guy, don’t claim to be). But the next question is, what is my <host_port> (how it is referred to in the code block above)?

    Everything matters, and nothing matters

    Executing a ping command in my Terminal to see my computer’s address is great, but it tells me nothing about the port.

    ping localhost to find ip address, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Use ping localhost to see your IP address

    So I took to the internet to try to figure out the appropriate port…Honestly, I’ve tried searching but I can’t find anything definitive. In the Podman documentation, I see a lot of reference to port 8080; as in localhost:8080.

    Care to review it? Here are some search results using “8080” as the search parameter.

    Buried in the docs, there is a brief mention of the port; it can be found under the –publish (or -p) parameter. As best I can tell, if you omit the host port, you shouldn’t have any problems: Podman just picks an available one for you. So…it doesn’t matter, nothing matters. It’s all an illusion.
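When the host port is omitted (as with the -p :1521 option used later in this post), you can ask Podman which port it picked with podman port <container>. A small sketch of pulling that number out in Python; it assumes output shaped like "1521/tcp -> 0.0.0.0:43073" (the port value here is made up and will differ on every run):

```python
def host_port(port_output: str, container_port: str = "1521/tcp") -> int:
    """Pull the host port out of `podman port <container>` output."""
    for line in port_output.splitlines():
        mapping, _, address = line.partition(" -> ")
        if mapping.strip() == container_port:
            return int(address.rsplit(":", 1)[1])
    raise ValueError(f"no mapping found for {container_port}")

# In practice, port_output would come from running `podman port myoracledb`
# (e.g., via subprocess); here is a hypothetical sample of that output:
sample = "1521/tcp -> 0.0.0.0:43073"
port = host_port(sample)
```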

    I also reviewed the cURL documentation. I found something in the --connect-to option:

    curl connect-to option, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Can you leave it empty?

    Aaaand, more port nonsense (if you are having trouble sleeping at night, try reading this):

    Starting the Podman Linux Virtual Machine

    Okay, with all this out of the way, I can finally start to make some progress (almost there, promise).

    Remember, you have to start the Podman Linux virtual machine before you do anything (this is in the instruction steps, so review those first; there are steps for macOS). This is where the container “lives.” Once the virtual machine is up and running, you’ll see something like this:

    podman machine start success, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Podman virtual machine is ready

    I then grabbed the Oracle container. But since I’m using Podman, I needed to modify the run command, like this:

    podman run -d -p :1521 --name myoracledb container-registry.oracle.com/database/enterprise:21.3.0.0

    REMINDER: Make sure you are logged into the Oracle Container Registry site before you attempt all this!

    Assuming you’re still following along, you’ll see something like this in your Terminal:

    Container has been created, and is starting, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    New container ID

    I used the podman ps command to check the status of the container. You should see something like this:

    starting container and healthy status, Container has been created, and is starting, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    “Starting” and “Healthy” statuses

    For several minutes, you’ll continue to see the container status as “starting”. You can do like me and just periodically enter the podman ps command, or you can go do something meaningful with your time. Check back in 10 mins or so. The choice is yours. Either way, you’ll eventually see the status change from “starting” to “healthy”.

    container is now healthy, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    The container is healthy and ready

    “Healthy” means I now have an Oracle Enterprise database (version 21.3.0.0) running in my Linux virtual machine as a container. I still need to log in with SQLcl, though.

    But wait…there’s more

Hold up, I can’t just log into SQLcl. I still have some more setup to do. I need to reset the randomly generated password to one of my choosing. Our instructions tell you to first issue the docker logs + [your database name] command to view your database logs; from there, you should be able to locate the password. I couldn’t; maybe you can. Let me know if you were able to.

    podman logs your database name for password, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Using the logs command

    Since I’m doing this in Podman, that command is slightly modified:

    podman logs myoracledb

    The printout of that command will appear like this (yours will be very similar). Although I wasn’t able to locate the password, there are still some important pieces of information that you’ll want to review and note.

    podman logs database output, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    podman logs [your database name] printout

In this printout you’ll see things like the local host and port information; the “Global Database Name” and “System Identifier (SID)” can be seen as well. You’ll also see where the log files are located (your temporary password can be retrieved from there) and the database version you are running. Finally, you’ll see the message “DATABASE IS READY TO USE!”
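Since the password supposedly hides in those logs, a quick filter might surface it. This is a guess on my part; the exact log wording varies by image version:

```shell
# Search the container logs for any line mentioning a password
podman logs myoracledb 2>&1 | grep -i password
```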

    Use the included shell script for changing your password

We are this close to logging onto the database. Even though I couldn’t find the temporary password, it doesn’t matter; you have to change your password anyway. If you refer back to the instructions on the Oracle Container Registry page, there is a section entitled “Changing the Default Password for SYS User,” and it reads as such (emphasis added):

    On the first startup of the container, a random password will be generated for the database if not provided. The user [must] change the password after the database is created and the corresponding container is healthy.

    Using the docker exec command, change the password for those accounts by invoking the setPassword.sh script that is found in the container. Note that the container must be running. For example:

    $ docker exec <oracle-db> ./setPassword.sh <your_password>
    Oracle Docs

    Easy enough, and since my container is “healthy” at this point, I can execute this script. But since I’m using Podman, the command will look like this:

    podman container exec myoracledb ./setPassword.sh password1234

    And the output of that command will look like this:

    altering password with provided shell script, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Automated password change with the provided Shell script

    I guess it worked. As you can see, my new password is password1234 (pleeeeease, do NOT share that with anybody). And at this point, I’m ridiculously close to logging onto this completely containerized Oracle enterprise database. All I need to do now is log on using the same steps as before (in my previous post).

    Connecting from outside the container

    Referring back to the Oracle Container Registry docs, I see the following:

    connecting from outside container, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    The different login options
    NOTE: Remember I'm logging into this container from the outside. 

    The connect options are cut-off in that image, so let me copy/paste them here. Also, assume where it states “sqlplus” I’ll be connecting with SQLcl. The options are as follows:

    $ sqlplus sys/<your_password>@//localhost:<exposed_port>/<your_SID> as sysdba
    
    $ sqlplus system/<your_password>@//localhost:<exposed_port>/<your_SID>
    
    $ sqlplus pdbadmin/<your_password>@//localhost:<exposed_port>/<your_PDBname>

    At this point, I can proceed. I have all the ingredients for this “recipe.”

    Use the port command, duh

    Turns out you can just use the port command to discover the container’s port (I’m guessing this is the route the container uses to communicate with my MacBook – it’s all quite muddled at this point).

    Here is the command I executed:

    podman port myoracledb

    And here is what was returned:

    exposing ports for container on podman, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Exposing the ports for this network

If you are starting your journey from the MacBook, its address would be 0.0.0.0 with a port of 43073. Data/info flows in and out of that port. And 1521 is a reference to the [bleep blurp 🤖] TCP port where the Transparent Network Substrate (TNS) Listener is located.

    Actually, if you look at the previous output (from the podman logs myoracledb command) you’ll see how all the addresses and ports connect (including the TNS Listener).

    tns listener in podman container, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    TNS Listener information
    It's in the logs, how could you not know this!?

    Honestly, this is all ludicrous. You shouldn’t know all this, nobody should! It’s too much to retain, but that’s okay, I’m glad you’re still here…toughing it out with me. Once you get past this first big hurdle, I imagine working with containers is very fun.


    Logging on with SQLcl, for real

    Here is where I actually logged on (or is it logged into?) to this database with SQLcl. Here’s the command I used:

    sql sys/password1234@//localhost:43073/ORCLCDB as sysdba

Which, if you recall, is modeled on the original command found in the Oracle Container Registry docs; it looks like this (it’s also a few paragraphs back):

    $ sqlplus sys/<your_password>@//localhost:<exposed_port>/<your_SID> as sysdba

    NOTE: Exposed port is that where the TNS Listener is located, and the SID is the “System Identifier” – I showed that in the database logs earlier.

    And again, I don’t think it matters if you include the localhost port. Here is what the output looked like in my Terminal:

    connecting via sqlcl to podman container, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    Connecting to the container with SQLcl

    Create and load a new table with SQLcl

Alright, so finally, I’m in! Next, I tested a SQLcl function to see if it worked as expected. I chose the LOAD function. And just as a refresher, I referred to the SQLcl help for in-context assistance. That help looks like this:

    help and more load info, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    SQLcl Help and LOAD information

Specifically, I am going to test out the “CREATE TABLE” and “LOAD TABLE” functions. So I scrolled down to the examples for reference.

    create and load table function in sqlcl, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database copy
    Example showing how to create and load a table at the same time

    At this point, the commands are pretty straightforward. I decided to use the LOAD NEW command, as seen above.

    PRO TIP: You can simply drag the file and drop it into Terminal to get the complete file path.

DON’T forget to include the “NEW” at the end of the command. I forgot it the first time, and my load failed. It doesn’t break anything; it’s just a silly mistake.
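For reference, the full command shape is roughly this. The file path here is illustrative (drag-and-drop will fill in your real one), and the table name matches the one I query later:

```sql
-- CREATE TABLE + LOAD in one step; "NEW" tells SQLcl to create the table first
load t100movies /Users/chris/Downloads/imdb_top_100.csv new
```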

I hit enter, and if you look at that image with the “Ready to execute” caption, everything worked as expected. Here it is zoomed in (please excuse the gray shading):

    new table processed without errors, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
Processed without errors 🫠

    SQL queries on the newly created table

    Alright, so I have a brand new table. And if you recall, this was a data set that included the IMDb top 100 highest-rated movies of all time.

    kaggle imdb top 100 dataset, Chris Hoina, Senior Product Manager, ords, Database Tools, Oracle Database
    IMDb dataset by way of Kaggle

    Well in the next few images, I wanted to test some SQL on the new table. I first searched by genre, with the following SQL:

    SELECT DISTINCT genre FROM t100movies;
    selecting distinct from new table, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
    Selecting by the movie genre

    Which returns all the distinct matches. Easy enough right? Then (because I like Adventure and Fantasy) I selected only those films that match those criteria, with this SQL statement:

    SELECT * FROM t100movies WHERE genre = 'Adventure, Fantasy';
    selecting specific genres and reviewing results, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl
Single quotes and don’t forget your semi-colon 🫡

    And once I correctly entered the SQL query, I found a single movie in the top 100 that meets those criteria. I’m actually surprised this one made the top 100 list.

    Data persists in your container

    one more thing, chris hoina, senior product manager, oracle database tools, oracle cloud, ords, oracle rest apis, sql developer, podman, sqlcl

    The files are in the [container]!

    Derek Z. and Hansel M.

    Okay, but there’s just one more thing. The data persists inside the container even after I’ve stopped it. This isn’t necessarily the focus of this article, but I just wanted to demonstrate that even after shutting everything down, the table still exists.

    This is true even after completely stopping my Podman container and shutting down the Podman Linux virtual machine.

    The process was as follows:

    • Exited out of the SQLcl application
    • Stopped the myoracledb container process
    • Checked to make sure the process was actually stopped
    • Stopped and then restarted the Podman Linux virtual machine
    • Restarted the myoracledb container
    • Executed the same SQL query as before
    • Exited from the SQLcl application a final time
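Sketched as commands, that sequence looks roughly like this (using the names from this post):

```shell
# (exit SQLcl first with its "exit" command)
podman stop myoracledb     # stop the container
podman ps -a               # confirm it actually stopped
podman machine stop        # shut down the Linux VM...
podman machine start       # ...and bring it back up
podman start myoracledb    # restart the container
podman ps                  # wait for "healthy", then reconnect with SQLcl
```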

And if you take a look at all these images (they are numbered in order), you can see all the steps I took during this little test. Pretty cool, eh!?

    A couple of notes here:

    • When I restarted the container, it only took about a minute for it to move from a “starting” to a “healthy” status. So I think the first time you start this container it takes a while. Subsequent start-ups, a minute or less.
    • When you start back up the container, you don’t have to map anything, I believe all those settings are still intact. Either that or I just spent a whole bunch of time networking when I didn’t need to.

    The end

    And that does bring us to a close. If you’ve made it this far, you understand how you can at least get this container started and log on using SQLcl. Remember you can use Brew to install SQLcl and Podman. And of course, you’ll need to get the container I used (from our Container Registry; you can find it here in the Database category).

    Remember, I didn’t do anything with checkpoints (or checkpointing containers) or with volumes. At the time of this article, I wasn’t 100% confident in my approach, so I wanted to exclude it. However, as I understand it, volumes (and their use) are the preferred approach to persisting data for later use (in containers). Just keep that in mind.
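For completeness, a volume-based run might look something like this. Treat it as an untested sketch: the oradata mount point is the datafile directory used by the Oracle container images, and the volume name is my own invention:

```shell
# Datafiles land in the named volume "oradata" and survive container removal
podman run -d -p :1521 --name myoracledb \
  -v oradata:/opt/oracle/oradata \
  container-registry.oracle.com/database/enterprise:21.3.0.0
```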

    Finally, I’m not going to sugarcoat it. This was a grind – it was very tedious and frustrating, so hopefully, you can learn from my mistakes. I’m not claiming to be a developer or an expert in CI/CD. But I can totally see the appeal of having portable containers like this. The barrier to understanding stuff like this is incredibly high, so good luck to you. But hey, if you screw up, don’t worry about it. You can always uninstall and reinstall and try again.

Be sure to leave a comment if you see something amiss or if you have a better approach to something I’ve shown here. And as always…

  • Where to learn SQL: a Slack message

    Where to learn SQL: a Slack message

What follows is a response I sent via Slack to one of our newest UX Designers. She comes to us by way of another sister business unit within Oracle. She was looking for some resources on where to learn/get better acquainted with SQL (which, for a UX and/or UI designer, I think is really impressive!).

    As I was putting this list together, the thought occurred to me, “hey, I think this stuff could be helpful to others outside of Oracle too!” So here we are. What follows (with minor edits) is a list of resources that I’ve found over the past year that have helped me to better understand SQL. Maybe you’ll discover another previously unknown resource.

    Resources

    In no particular order (seriously, these came to me at random):

    Update to the list. Thanks to Chris Saxon!

    Dev Gym

    dev gym learn sql at oracle, chris hoina, senior product manager, ords, oracle autonomous database
    Learn SQL with Dev Gym

    What is it? Verbatim, here is the “elevator pitch” from our site:

    Learn SQL in this FREE 12-part boot camp. It will help you get started with Oracle Database and SQL. The course is a series of videos to teach you database concepts, interactive SQL tutorials, and quizzes to reinforce the ideas. Complete the course to get your free certificate.

    Link to course

    Hey, pretty cool you end up with a free certificate too!

    NOTE: You'll need to create an Oracle account first. You can sign-up here. 

    O’Reilly

    oreilly welcome screen learning, chris hoina, senior product manager, oracle, ords, database actions, sql developer, sqlcl
    O’Reilly Welcome page

Within Oracle, we have access to O’Reilly; you may, too, so check internally. This is the second employer I’ve seen where this is available. It’s chock-full of digital learning content: videos, tutorials, books, and guides. You can even create “Playlists” for similar topics; here are mine:

    oreilly playlists, chris hoina, senior product manager, oracle, ords, database actions, sql developer, sqlcl
    My O’Reilly playlists

    Live SQL

    Oracle Live SQL landing page, chris hoina, senior product manager, oracle, ords, database actions, sql developer, sqlcl
    Oracle Live SQL

    Oracle has a browser-based app, Live SQL, where you can learn all sorts of SQL. I don’t learn like this, but others might (I need skill acquisition to be more practical). If you learn through rote, then this is the site for you!

    SQL Worksheet in the Database Actions Launchpad

    sql worksheet in database actions, sql developer web, chris hoina, senior product manager, ords, oracle autonomous database
    SQL Worksheet via Database Actions

    Sign up for one of our OCI Free Tier accounts and create an Autonomous Database (ADB). After that you can get a feel for how Database Actions (aka SQL Developer Web) works and how to interact with your database.

    From there, if you want to look at SQL specifically, I would focus on the SQL Worksheet. Once there, you can practice simple SQL queries.

    Reader: I don't know your level, so you may already be more familiar with this than me. But it's free, so why not?

    LiveLabs

    Typing SQL in LiveLabs, chris hoina, senior product manager, ords, oracle autonomous database
    SQL learning in LiveLabs

    This is a straightforward and approachable entry point. Simply typing “sql” reveals tons of relevant workshops. LiveLabs home.

    Oracle SQL Language Guide

This is the official guide for the current Oracle version 21 database. It would be a good thing to bookmark. But there is so much here that you won’t want to sit down and read through it in one sitting.

    PL/SQL

    This is a PL/SQL language guide. I can only explain PL/SQL as “SQL Advanced.” It’s not better; it is a way to give you more control over when, how, and where to use SQL (my interpretation). Wikipedia does a better job of explaining. You won’t be using this initially. I’m just starting to get into it after a year. But the sooner you can use it, the better!

    W3 Schools

    Great for many languages (as well as CSS/HTML). It is a memory HOG, though! I don’t know what is happening (probably the ads), but at least on Firefox, your computer’s fans will be working double-time. So get in, get out; otherwise, your computer will slow to a crawl. Link to SQL topic.

    Errors/troubleshooting

    StackOverflow

    Using the error code (or parts of it) as a keyword in StackOverflow works quite well. Like this:

    Random Oracle error in Stack Overflow search, chris hoina, senior product manager, oracle, ords, database actions, sql developer, sqlcl
    Using a random error code as an example

    You can even create Watched Tags to keep up on topics that you are most interested in.

    Watched Tags in Stack Overflow, chris hoina, senior product manager, oracle, ords, database actions, sql developer, sqlcl
    Watched Tags in StackOverflow

    Link to StackOverflow.

    YouTube

    YouTube works well too (duh). Particularly this user!

    Nothing fancy about these two suggestions...I almost always defer to practical videos for learning/diagnosing issues.

    The End

    And that’s all I can think of for right now. Those should set you on the right path. Did I miss anything?

    Leave a comment for your go-to SQL resources, I’ll add them to the list!

  • An intro to using Oracle SQLcl on Mac

    An intro to using Oracle SQLcl on Mac

    Oracle SQLcl blurb, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    SQLcl…my newest crush?

Did you know you can use Homebrew to install Oracle’s SQLcl on Mac? I just realized this about a week ago (always the bridesmaid, never the bride…amirite??).

    Homebrew

    First you’ll need to install Homebrew (I’m sure there are other ways to install SQLcl, but installing through Homebrew was a breeze).

    You can install Homebrew on your Mac by first opening up a new terminal window and typing/entering:

    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

    That Shell script should walk you through the setup.

    DISCLAIMER: I didn't go that route, but if you follow the directions on the Homebrew site I assume it should work.

    If you want a more hands-on approach, visit this site for a complete walk through of setting up your new Mac for application development. You may not need to do everything on that site, but read up on the Homebrew & Cask sections.

    Installing SQLcl

I’ve since learned that you are really installing the SQLcl app via Cask (which is included in Homebrew). Cask allows the installation of “large binary files” (see the site from the paragraph above for more details). Here is a list of the currently available Cask applications.

    cask applications sqlcl now available, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    We’re giving Pi a run for its money with that semantic versioning…

    Once you are all updated with Homebrew, you can then open up a new terminal and enter the following:

    brew install sqlcl 

    As it installs, you’ll see a lot of activity in the terminal window. Once complete, you’ll see something that looks like this (I’ve already installed/reinstalled it tons of times, so there may be some slight difference):

    sqlcl install complete, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Don’t forget to review the Caveats section!

    Caveats

The main things to review are in the “Caveats” section. First, you’ll need Java 11 or higher for this to work (i.e., connect to an Oracle database). I didn’t realize this, but we give you a command to update to the latest Java version. I wish I had known that, as I spent way too much time figuring out the best way to update.

    brew command for upgrading java, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Upgrading Java through Homebrew

    Second, you’ll need to add a new line to your “PATH environment variable”.

    Line to be added to PATH environment variable, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database copy
    New line to be added to your PATH Environment Variable

    What is “PATH?”:

    PATH is an environment variable on Unix-like operating systems, DOS, OS/2, and Microsoft Windows, specifying a set of directories where executable programs are located. In general, each executing process or user session has its own PATH setting.

    Wikipedia.org

    I understand this, as specific applications will only work if you’ve predefined the locations of their dependencies. You can indicate where your operating system looks for these dependencies by updating the PATH Environment Variable (a separate file; more on this in a second). We have another excellent resource here (it explains PATH and CLASSPATH well).

    Locating PATH on Mac

    On a Mac, there are a couple of ways you can find PATH.

    PRO TIP: PATH export definitions are located in a .zprofile file.

    The easiest way (for me) to find this file is by typing/entering in a terminal window:

    open .zprofile 

LEARN ZSH: Want to learn all there is about zsh, .zshenv, .zprofile, .zshrc, or .zlogin? Bookmark this manual for future use.

    From there, your .zprofile file will appear in a new window. Mine looks like this:

    Initial zprofile file without new sqlcl line, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    A look at my .zprofile file.

    If you recall from the “Caveats” section, you may need to add a line to your PATH. I’ve already done that; I added a comment for reference (optional, but make sure the comment is preceded with a “#”).

    zprofile file for PATH environment variable, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    .zprofile file with the new line added.
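In plain text, the addition looks something like this. The exact prefix is whatever the Homebrew Caveats output printed for you; Apple Silicon installs typically live under /opt/homebrew instead of /usr/local:

```shell
# Added for SQLcl (path is an example; use the one brew printed for you)
export PATH="/usr/local/opt/sqlcl/bin:$PATH"
```

Afterwards, `which sql` in a fresh terminal is a quick way to confirm the shell can find the launcher.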

    Remember to save (with CMD + S)! After which, you can close out the window.

    Also, it’s a good idea to close any active terminals and open a new one (this way your terminal picks up any changes you’ve made).

    You can also perform a check to see what is installed via Homebrew with the following command:

    brew list

    You’ll see something akin to this (depending on what you have installed):

    brew list printout, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Use brew list to see current Homebrew installs.

    Dive into SQLcl

    Okay, now we are ready to explore SQLcl!

DISCLAIMER: I'm not connecting to my database yet (I will be in my next post as I'm just working out the kinks on my Podman setup…containers, baby!).

    I’ll keep this next section simple. Begin with a new terminal and type/enter:

    sql -h 

    or

    sql -help 

    You’ll see the following printout:

    sqlcl initial help screen, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Help printout.

    If you look closely, you’ll see information for Usage 1 and Usage 2.

    README: When in doubt, refer to the help!

    Usage 1

    Usage 1 – great for reviewing in-context help documentation as well as version information.

    Usage 1 sqlcl initial help screen, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Usage 1 focus.

    Regarding help, I’ve tried the following (they all work):

    • sql -h
    • sql -help
    • sql -Help
    • sql -H
    • sql -HELP
    HINT: Type/enter exit into the command line to exit the SQLcl help screen.
    type exit to return to the shell, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Using the exit command.

    Usage 2

    Usage 2 sqlcl initial help screen, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Usage 2 focus.

    In Usage 2, you’ll find information for two login options:

    • Login with a “Connect Identifier”
    • No logon

    The Connect Identifier can be either:

    • “Net Service Name”
    • “Easy Connect”

    Wut r theez?

    I found some information relating to the “Net Service Name” method of connection; you can refer to that here. Be forewarned – there seems to be some configuration required to use the Net Service Name method (I’ve not tested this yet).

    Conversely, the Easy Connect Method looks well…easier. I found a good resource here. This was the method I used when experimenting with containers and Podman (blog coming soon!).

    Now, if you are like me and want to explore SQLcl (without connecting to an Oracle database), you can log in using the /NOLOG option. Make sure you exit out of the SQLcl help screen first.

    Once you’re out, type/enter the following command:

    sql /NOLOG 
    NOTE: Make sure you have a space between the "l" in sql and the "/" of /NOLOG. 

    Once you hit enter, you should see a screen like this:

    sqlcl nolog option, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Logging in with the /NOLOG option.

    Unimpressive, right? Well, allow me to whet your appetite some. From here, you have two more options. Those are:

    • h
    • help

    Entering h will reveal a history of the most recent shell commands you’ve executed.

    sqlcl history, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    Shell command history.

Type/enter help and you’ll reveal a list of the available SQLcl commands and options. It looks like this:

    help screen once logged on with sqlcl, chris hoina, senior product manager, oracle database tools, command line, oracle autonomous database
    So. Many. Options.

    Pretty cool, eh?

You can take this one step further by typing/entering a topic of interest. Here are a couple of random topics I explored (ALIAS and MODELER):

    Final thoughts

    While I have yet to take full advantage of what SQLcl offers, I see the potential time savings for application developers who want to stay in a text editor while coding (without switching to another GUI application).

    I’ll include the SQLcl documentation so you have it for reference. But be forewarned we’re updating this document; some instructions may be changed.

    And check back in a week or two once I get Podman fully working with one of our Database Containers. I’ll test SQLcl, ORDS, and an Oracle Enterprise database 21.3.x (if you’re curious about our available containers, you can find them here).

    Want to learn even more about SQLcl? Check out these helpful resources:

    And finally…


  • Using Python Pandas to turn ISO Country Codes into a string to use as values for a SQL Query

    Using Python Pandas to turn ISO Country Codes into a string to use as values for a SQL Query

    Summary, code, resources

    Problem

While querying a table (based on this dataset) with SQL, you realize one of your columns uses 3-character ISO Country Codes. However, some of these 3-character codes aren’t countries at all; they’re geographical regions or groups of countries mixed in with the actual country codes. How can you filter out rows so you are left with the countries only?

    Answer

    Use the Python Pandas library to scrape ISO country codes and convert the values to one single string. Then use that string as values for a subsequent SQL query (possibly something like this):

    SELECT * FROM [your_table]
    WHERE country_code IN ([values from the generated list-as-string separated by commas and encased by single / double quotes]);

    Code

    # Libraries used in this code
    from bs4 import BeautifulSoup
    import requests
    import csv 
    import pandas as pd 
    
    # I found these ISO country codes on the below URL. Pandas makes it easy to read HTML and manipulate it. Very cool!
    iso_codes = pd.read_html("https://www.iban.com/country-codes")
    
    # I create a data frame, starting at an index of 0.
    df = iso_codes[0]
    
    # But really, all I care about is the 3-digit country code. So I'll make that the df (dataframe) and strip out the index
    df = df['Alpha-3 code'].to_string(index=False)
    
    # From here, I'll save this little guy as a text file.
    with open("./countries.txt", "w") as f:
        f.write(df)
    
    # I'll set up a list. *** This was my approach, but if you find a better way, feel free to comment or adjust. ***
    my_list = []
    
    # Then I'll open that text file and read it in.
    file = open("./countries.txt", "r")
    countries = file.read()
    
    # I need to remove the "new line" identifiers, so I'm doing that here. 
    my_list = countries.split('\n')
    
    # Once I do that, I can create two new strings. I do this with f-Strings. Great article on using them here: https://realpython.com/python-f-strings/ 
    
    # I have two options here: one where the codes are contained by single quotes, the other with double quotes. Oracle Autonomous Database likes single quotes, but your DB may differ.
    
    countries_string_single_quotes = ','.join(f"'{x}'" for x in my_list)
    
    countries_string_double_quotes = ','.join(f'"{x}"' for x in my_list)
    
    # From here, I take those strings and save them in a text file. You don't have to do this; you can print and copy/paste the string. But this might be an excellent addition if you want to refer to these later without running all the code. 
    
    with open("./countries_as_list_single_quotes.txt", "a") as f:
        f.write(countries_string_single_quotes)
    
    with open("./countries_as_list_double_quotes.txt", "a") as f:
        f.write(countries_string_double_quotes)
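As an aside, the temp-file round trip can be skipped entirely: a pandas Series is already iterable, so you can build the string straight from its values. A minimal sketch (the helper name is mine, not from the repo):

```python
# Hypothetical compact alternative: join the scraped codes directly,
# with no intermediate text file.
def codes_to_sql_values(codes):
    """Join ISO codes into a single-quoted, comma-separated string for SQL IN (...)."""
    return ",".join(f"'{c}'" for c in codes)

# With the live page, you would feed it the scraped column, e.g.:
#   df = pd.read_html("https://www.iban.com/country-codes")[0]
#   values = codes_to_sql_values(df["Alpha-3 code"])
print(codes_to_sql_values(["AFG", "ALB", "DZA"]))  # 'AFG','ALB','DZA'
```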

    GitHub repo details

    You can find the code from this post in my GitHub repository. The repository consists of the following:

    • The Python code I created for solving this problem
    • A countries.txt file, which is produced midway through the code (temporary placeholder for later processing)
    • ‘Single quotes’ .txt file – the 3-character ISO Country Codes are formatted as a string. The values are enclosed by single quotes; commas throughout
    • “Double quotes” .txt file – the 3-character ISO Country Codes are formatted as a string. The values are enclosed by double quotes; commas throughout

    Resources


    The in-depth walkthrough

    Backstory

    A few days ago, I stumbled upon this question in the r/datasets subreddit:

    I spent most of the morning figuring out how I would go about this, and after some trial and error, I devised a plan. I decided to take the list of ISO Country Codes (which I found here) and use them as values for filtering in a SQL statement (later on in Oracle SQL Developer Web).

    After some research, I figured out the proper SQL syntax for a successful query.

    SELECT * FROM [your_table]
    WHERE country_code IN ([values from the generated list-as-string separated by commas and encased by single / double quotes]);
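With a handful of codes substituted in (hypothetical table and column names), that query becomes:

```sql
SELECT *
  FROM country_data
 WHERE country_code IN ('AFG','ALB','DZA');
```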

    From there, I knew I needed to work backward on those ISO Country Codes. Meaning I needed to take something that looked like this:

    And turn it into something more workable. It turns out that grabbing this was pretty straightforward. I’m using Pandas primarily for this exercise, but first, I need to import some libraries:

    # Libraries used in this code
    from bs4 import BeautifulSoup
    import requests
    import csv 
    import pandas as pd 

    Next, I’ll use Pandas’ read_html function (this feels like cheating, but it’s incredible) to read in the table.

    # I found these ISO country codes on the below URL. Pandas makes it easy to read HTML and manipulate it. Very cool!
    iso_codes = pd.read_html("https://www.iban.com/country-codes")
    
    # I create a data frame, starting at an index of 0.
    df = iso_codes[0]

    This is wild, but this is what the printout looks like:

    The Pandas read_html() function is powerful.

    If you squint, you can see an “Alpha-2 code” and an “Alpha-3 code” column in the image. From here, I need to isolate the “Alpha-3 code” column. So I reduce the data frame to that single column and drop the index (this is optional; you could keep the index if you needed it, perhaps to create a separate table in your database).

    # But really, all I care about is the 3-character country code. So I'll reduce the df (dataframe) to that single column and strip out the index
    df = df['Alpha-3 code'].to_string(index=False)

    I’ll save this data frame as a .txt file.

    # From here, I'll save this little guy as a text file.
    with open("./countries.txt", "w") as f:
        f.write(df)

    This is only temporary (FYI: this is the only way I could figure out how to do this). It’ll look like this:

    The temporary .txt file of 3-character ISO Country Codes.
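    As an aside (a shortcut sketch, not the route this post takes): pandas can hand the column back as a plain Python list via .tolist(), which would skip the temporary file and the newline cleanup entirely. The miniature data frame below is a hypothetical stand-in for the scraped table.

    ```python
    import pandas as pd

    # A tiny stand-in for the scraped table; the real data frame comes from
    # pd.read_html("https://www.iban.com/country-codes")[0]
    df = pd.DataFrame({"Alpha-3 code": ["AFG", "ALB", "DZA"]})

    # .tolist() returns the column as a plain Python list -- no temp file, no "\n" characters
    my_list = df["Alpha-3 code"].tolist()
    print(my_list)
    ```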

    Next, I take that temporary text file and read it back in. Splitting its contents gives me a list (aptly named “my_list“). I also need to remove the newline characters; otherwise, when I create my string of values (that comes in the final step), the string will look like this:

    The “countries” string with “\n” characters.

    I remove the newline characters with this piece of code:

    # I need to remove the "new line" identifiers, so I read the temporary file back in
    # and split on "\n", which drops them.
    with open("./countries.txt") as f:
        countries = f.read()

    my_list = countries.split('\n')

    The almost string of values will look like this:

    New line characters have now been removed.
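    One thing to watch for, depending on how to_string() padded your file (an assumption on my part; your output may already be clean): the entries can carry leading whitespace or the column header. A quick strip-and-filter pass, sketched here with hypothetical raw contents, tidies that up:

    ```python
    # Hypothetical raw contents read back from countries.txt
    countries = "Alpha-3 code\n         AFG\n         ALB\n         DZA"

    # Strip the padding, then drop the header row and any empty entries
    my_list = [x.strip() for x in countries.split('\n')]
    my_list = [x for x in my_list if x and x != "Alpha-3 code"]
    print(my_list)
    ```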

    I use F-Strings to create the following two strings: countries_string_single_quotes and countries_string_double_quotes, respectively. Need to learn about F-Strings (or, more formally, Literal String Interpolation)? No problemo! Check out these resources:

    The code for the F-Strings is below. I loop through my_list and separate each x (the item I’m iterating over) with commas (that’s what the join does).

    # Once I do that, I can create two new strings. I do this with f-Strings. Great article on using them here: https://realpython.com/python-f-strings/ 
    
    # I have two options here: one where the codes are contained by single quotes, the other with double
    # quotes. Oracle Autonomous Database likes single quotes, but your DB may differ.
    
    countries_string_single_quotes = ','.join(f"'{x}'" for x in my_list)
    
    countries_string_double_quotes = ','.join(f'"{x}"' for x in my_list)

    And now that I have these two objects (are they called objects??), I’ll save them each as a text file. One file has the 3-character codes surrounded by single quotes, the other with double quotes. The code:

    # From here, I take those strings and save them in a text file. You don't have to do this; you can print
    # and copy/paste the string. But this might be a nice addition if you want to refer to these later 
    # without running all the code. 
    
    with open("./countries_as_list_single_quotes.txt", "a") as f:
        f.write(countries_string_single_quotes)
    
    with open("./countries_as_list_double_quotes.txt", "a") as f:
        f.write(countries_string_double_quotes)

    The text files look like this now:

    The country codes are now presented in one long string. Pretty cool, eh?

    SQL time

    We have arrived! Let me show you what I can do now!

    I took the CSV data from the World Bank and loaded it into my Autonomous Database. Our returning intern Layla put together a video of how to do this; you can check it out here:

    Once my table was created, I ran a SELECT [columns] FROM [table] query. Here you can see my “beginning state”.

    There are 266 entries; some are countries, and others are not. And if you recall, the original question asked how somebody could filter out the non-countries. Onto that next!

    This is the best part. I can take the string I made and use that in a SQL query such as this:

    SELECT * from ADMIN.REDDIT_TABLE
    WHERE COUNTRY_CODE IN('AFG','ALA','ALB','DZA','ASM','AND','AGO','AIA','ATA',
    'ATG','ARG','ARM','ABW','AUS','AUT','AZE','BHS','BHR','BGD','BRB','BLR','BEL',
    'BLZ','BEN','BMU','BTN','BOL','BES','BIH','BWA','BVT','BRA','IOT','BRN','BGR',
    'BFA','BDI','CPV','KHM','CMR','CAN','CYM','CAF','TCD','CHL','CHN','CXR','CCK',
    'COL','COM','COD','COG','COK','CRI','CIV','HRV','CUB','CUW','CYP','CZE','DNK',
    'DJI','DMA','DOM','ECU','EGY','SLV','GNQ','ERI','EST','SWZ','ETH','FLK','FRO',
    'FJI','FIN','FRA','GUF','PYF','ATF','GAB','GMB','GEO','DEU','GHA','GIB','GRC',
    'GRL','GRD','GLP','GUM','GTM','GGY','GIN','GNB','GUY','HTI','HMD','VAT','HND',
    'HKG','HUN','ISL','IND','IDN','IRN','IRQ','IRL','IMN','ISR','ITA','JAM','JPN',
    'JEY','JOR','KAZ','KEN','KIR','PRK','KOR','KWT','KGZ','LAO','LVA','LBN','LSO',
    'LBR','LBY','LIE','LTU','LUX','MAC','MKD','MDG','MWI','MYS','MDV','MLI','MLT',
    'MHL','MTQ','MRT','MUS','MYT','MEX','FSM','MDA','MCO','MNG','MNE','MSR','MAR',
    'MOZ','MMR','NAM','NRU','NPL','NLD','NCL','NZL','NIC','NER','NGA','NIU','NFK',
    'MNP','NOR','OMN','PAK','PLW','PSE','PAN','PNG','PRY','PER','PHL','PCN','POL',
    'PRT','PRI','QAT','REU','ROU','RUS','RWA','BLM','SHN','KNA','LCA','MAF','SPM',
    'VCT','WSM','SMR','STP','SAU','SEN','SRB','SYC','SLE','SGP','SXM','SVK','SVN',
    'SLB','SOM','ZAF','SGS','SSD','ESP','LKA','SDN','SUR','SJM','SWE','CHE','SYR',
    'TWN','TJK','TZA','THA','TLS','TGO','TKL','TON','TTO','TUN','TUR','TKM','TCA',
    'TUV','UGA','UKR','ARE','GBR','UMI','USA','URY','UZB','VUT','VEN','VNM','VGB',
    'VIR','WLF','ESH','YEM','ZMB','ZWE')
    ORDER BY COUNTRY_CODE ASC;

    Once I execute that SQL statement, I’m left with the countries from that list. I opened up the results in another window so you can see a sample.
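    If you’d rather not copy/paste, the same Python script could stitch the generated string straight into the statement with one more F-String. This is a hypothetical sketch; the table and column names match the ones above, and the three codes stand in for the full generated string:

    ```python
    # Stands in for the full countries_string_single_quotes built earlier
    countries_string_single_quotes = "'AFG','ALB','DZA'"

    # Assemble the final SQL statement with an f-string
    sql = (
        f"SELECT * FROM ADMIN.REDDIT_TABLE "
        f"WHERE COUNTRY_CODE IN ({countries_string_single_quotes}) "
        f"ORDER BY COUNTRY_CODE ASC;"
    )
    print(sql)
    ```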

    The end

    So yeah, that’s it! I don’t know if this was the best way to go about this, but it was fun. I’m curious (if you’ve made it this far), what do you think? How would you go about it? Let me know.

    And two more things: remember to share this and…