Development

Unity and Difference

Greg Pavlik - Wed, 2020-06-03 09:43
One of the themes that traveled from Greek philosophy through to the unfolding of modernity was the neoplatonic notion of "the One". A simple unity in which all "transcendentals" - beauty, truth, goodness - both originate and in some sense coalesce. In its patristic and medieval development, these transcendentals were "en-hypostasized" or made present in persons - the idea of the Trinity, where a communion of persons exists in perfect love, perfect peace and mutual self-offering: most importantly, a perfect unity in difference. All cultures have their formative myths and this particular myth made its mark on a broad swath of humanity over the centuries - though I think in ways that usually obscured its underlying meaning (unfortunately).

Now I have always identified with this comment of Dostoevsky: "I will tell you that I am a child of this century, a child of disbelief and doubt. I am that today and will remain so until the grave": sometimes more strongly than others. But myths are not about what we believe is "real" at any point in time. The meaning of these symbols, I think, says something to all of us today - particularly in the United States: that the essence of humanity may be best realized in a unity in difference that can only be achieved through self-offering love. In political terms we are all citizens of one country and our obligation as a society is to care for each other. This much ought to be obvious - we cannot exclude one race, one economic class, one geography, one party, from mutual care. The whole point of our systems, in fact, ought to be to realize, however imperfectly, some level of that mutual care, of mutual up-building and mutual support.

That isn't happening today. Too often we are engaged in the opposite - mutual tearing down and avoiding our responsibilities to each other. I wish there were a magic fix for this: it clearly has been a problem that has plagued our history for a long, long time. The one suggestion I can make is to find a way to reach out across boundaries with care on a day-by-day basis. It may seem like a person cannot make a difference. No individual drop of rain thinks it is responsible for the flood.

Introduction to Apache Kafka

Gerger Consulting - Tue, 2020-06-02 13:03
Apache Kafka is a product that should be in every IT professional's toolbox.

Attend the webinar by developer advocate Ricardo Ferreira and learn how to use Kafka in your daily work.


About the Webinar:

The use of distributed streaming platforms is becoming increasingly popular among developers, but have you ever wondered why?
Part pub/sub messaging system, part distributed storage, part CEP-style event processing engine, this type of technology brings a whole new perspective on how developers capture, store, and process events. This talk will explain what distributed streaming platforms are and how they can be a game changer for modern data architectures. We'll discuss the road in IT that led to the need for this type of platform, the current state of Apache Kafka, and scenarios where this technology can be implemented.


About the Presenter:
Ricardo is a Developer Advocate at Confluent, the company founded by the original co-creators of Apache Kafka. He has over 22 years of experience in software engineering and specializes in service-oriented architecture, big data, cloud, and serverless architecture. Prior to Confluent, he worked for other vendors, such as Oracle, Red Hat, and IONA Technologies, as well as several consulting firms. When not working, he enjoys grilling steaks in his backyard with his family and friends, where he gets the chance to talk about anything that is not IT related. Currently, he lives in Raleigh, North Carolina, with his wife and son.
Categories: Development

Machine Learning for Oracle Professionals

Gerger Consulting - Tue, 2020-05-26 17:00

In this webinar, Oracle ACE Director Craig Shallahamer will introduce you to the world of applied Machine Learning from an Oracle Professional (DBA/Developer/Manager) perspective. This includes understanding what ML is, why to use it, and why now.

He will demonstrate how to create an automated anomalous performance detection system. He'll use industry-standard Python with its ML libraries and Jupyter Notebooks. You will be able to download and do everything he does in this webinar!


If you have ever wondered how ML can be applied in an IT environment, you don't want to miss this webinar.



Categories: Development

How to Improve Oracle Performance with NVM

Gerger Consulting - Mon, 2020-05-18 13:26
Non-Volatile Memory hardware is a technology that every Oracle customer must know about. Attend the webinar by Yaron Dar and find out how you can improve the performance of your Oracle database with NVM.

Covered topics:
  • Proof points, best practices, and guidelines for achieving peak performance for Oracle workloads with NVMe and Storage Class Memory.
  • Maintaining high availability through disasters with Oracle Extended RAC (demo).
  • Achieving amazing data reduction and storage efficiency for Oracle databases.

Register at: http://www.prohuddle.com/webinars/yarondar/oracle_performance.php


Categories: Development

Centrally Managed Users in Oracle (MS Active Directory)

Gerger Consulting - Tue, 2020-05-05 18:09
Attend the webinar by Oracle ACE Simon Pane and learn how to integrate Microsoft Active Directory with the Oracle database.

About the Webinar
An exciting new feature of Oracle Database 18c and 19c is "Centrally Managed Users" (CMU). For the first time, it allows simple and complete user management integration with Active Directory - no additional licenses required!
This presentation details the advantages of using CMU over the previous technologies, covers the installation/set-up process, and gives practical examples and use cases for organizations to leverage and benefit from the simplified integration of database user management (authentication and authorization) with Active Directory (AD).
Previously, Oracle Enterprise User Security (EUS) and Oracle Unified Directory (OUD) were needed as an intermediate component bridging the Oracle Database to Active Directory. While EUS/OUD is still available and does offer some additional features and benefits, it's no longer required for database-AD user integration and management.
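For a flavor of what CMU looks like in practice, here is a minimal sketch (my own example, not from the webinar) that maps an AD group to a single shared global database user:

-- all members of the AD group connect as this one global user
-- (the DN below is a hypothetical example)
create user ad_app_users identified globally as
  'CN=AppUsers,OU=Groups,DC=example,DC=com';
grant create session to ad_app_users;

Authentication then happens against AD, and the database no longer stores individual passwords for those users.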

This webinar is sponsored by Gitora, the source control tool for the Oracle database.
Categories: Development

Practical Oracle SQL Webinar by ACE Director Kim Berg Hansen

Gerger Consulting - Wed, 2020-04-08 14:26
ACE Director Kim Berg Hansen is the author of the book "Practical Oracle SQL: Mastering the Full Power of Oracle Database".



Kim Berg Hansen is known for presenting complicated SQL features in a very accessible way, so that developers can apply these features in their daily work.

In this webinar, Kim will present several SQL techniques, taken from his new book Practical Oracle SQL, and show you how you can apply them in real-life scenarios.

Kim will cover the following topics:
  • Tree Calculations with Recursion (Recursive subquery factoring)
  • Functions Defined Within SQL (Functions in the WITH clause)
  • Answering Top-N Questions (Analytic ranking functions)
  • Rolling Sums to Forecast Reaching Minimums (Analytic window clause, recursive subquery factoring, model clause)
  • Merging Date Ranges (Row pattern matching MATCH_RECOGNIZE)
Categories: Development

Oracle Database + Git = Gitora 4

Gerger Consulting - Thu, 2020-04-02 15:17

We are happy to announce the new version of Gitora, the version control tool for the Oracle Database.

Gitora 4 enables you to manage your entire Oracle code base with Git. New features in Gitora 4 include:
  • Push to and pull from repositories in GitHub, GitLab, Bitbucket, etc.
  • Simplified workflow and vastly improved working directory structure
  • SQL Developer Extension (Coming soon.)


Categories: Development

PL/SQL functions and statement level consistency

XTended Oracle SQL - Sun, 2019-12-29 18:40

You may know that whenever you call a PL/SQL function from within a SQL query, each query inside the function is consistent to the SCN of its own start, and not to the SCN of the parent query.

Simple example:

create table test as 
  select level a, level b from dual connect by level<=10;

create or replace function f1(a int) return int as
  res int;
begin
  select b into res 
  from test t 
  where t.a=f1.a;
  dbms_lock.sleep(1);
  return res;
end;
/

As you can see, we created a simple PL/SQL function that returns the result of the query select b from test where a = :input_var.

But let's check what it returns while another session changes data in the table:

-- session 2:
begin
    for i in 1..30 loop
      update test set b=b+1;
      commit;
      dbms_lock.sleep(1);
    end loop;
end;
/
-- session 1:
SQL> select t.*, f1(a) func from test t;

         A          B       FUNC
---------- ---------- ----------
         1          1          1
         2          2          3
         3          3          5
         4          4          7
         5          5          9
         6          6         11
         7          7         13
         8          8         15
         9          9         17
        10         10         19

10 rows selected.

As you can see, we got inconsistent results in the column FUNC, but we can easily fix this using an OPERATOR:

CREATE OPERATOR f1_op
   BINDING (INT) 
   RETURN INT 
   USING F1;

Let's revert the changes and check our query with the new operator now:

-- session 1:
SQL> update test set b=a;

10 rows updated.

SQL> commit;

Commit complete.

-- session 2:
begin
    for i in 1..30 loop
      update test set b=b+1;
      commit;
      dbms_lock.sleep(1);
    end loop;
end;
/

-- session 1:
SQL> select t.*, f1(a) func, f1_op(a) op from test t;

         A          B       FUNC         OP
---------- ---------- ---------- ----------
         1          2          2          2
         2          3          5          3
         3          4          8          4
         4          5         11          5
         5          6         14          6
         6          7         17          7
         7          8         20          8
         8          9         23          9
         9         10         26         10
        10         11         29         11

10 rows selected.

As you can see, the values in the column OP are equal to the values of B, while the function F1 still returns inconsistent values.

Categories: Development

Just short note for myself: OJPPD limitations

XTended Oracle SQL - Mon, 2019-12-02 06:26

As of Oracle 19c, OJPPD (old-style join predicate push-down) doesn't support CONNECT BY and TABLE():

OJPPD: OJPPD bypassed: query block contains START WITH/CONNECT BY.
OJPPD: OJPPD bypassed: View contains TABLE expression.
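These messages show up in the optimizer (event 10053) trace. A hypothetical query shape that hits the first restriction - an outer-joined view containing CONNECT BY - would be:

-- scott-style tables, for illustration only: the join predicate
-- v.deptno = d.deptno cannot be pushed into the view, so the whole
-- hierarchy is built before the outer join
select d.dname, v.path
  from dept d
  left join (select deptno, sys_connect_by_path(ename, '/') path
               from emp
              start with mgr is null
            connect by prior empno = mgr) v
    on v.deptno = d.deptno;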
Categories: Development

Free Oracle Cloud: 15. The request could not be mapped to any database

Dimitri Gielis - Wed, 2019-11-06 13:25
This post is the last post of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

At some point you might face the following message: "The request could not be mapped to any database":


Oracle is monitoring usage on your Always Free Account, and whenever it finds there has been no activity for 7 days, it will stop your database automatically. The data in the database is preserved, but the database won't be accessible anymore.

To fix the issue, log in to your Oracle Cloud Account and go to your Autonomous Database:


You will see the database is in a stopped state. Click the Start button:


The state will change to Starting...


And after a minute it becomes available again:


The above behavior is described in the end-user documentation:
Inactivity Monitoring and Database Stoppage
Persistently inactive Always Free Autonomous Databases are detected and handled as follows:
  • After being inactive for 7 days, the database will be stopped automatically, preserving its stored data. Inactivity measurements leading up to 7 days are based on database connections. Successfully making a SQL*Net or HTTPS connection resets these measurements to zero.
  • A database that is automatically or manually stopped and stays inactive for 90 days, cumulative, may be reclaimed and permanently deleted. Inactivity measurements leading up to 90 days are based on the database being inactive or in the stopped state. Starting a stopped database resets these measurements to zero.
    Start an Always Free Autonomous Database by clicking the Start button on the Oracle Cloud Infrastructure console. Start a stopped Always Free Autonomous Database before 90 days to avoid losing access to its data.

But this week some people complained that although they had activity, their database was stopped anyway. I witnessed the same behavior in my account, so I reached out to Oracle and they confirmed that their code to identify inactivity is not properly accounting for APEX/ORDS usage. They are already working on a fix, which they hope to apply very soon. I will update this post when I get confirmation the fix is in the data centers.

Categories: Development

Native Oracle DB JSON functionality as alternative for using cursor() in AOP (and APEX_JSON)

Dimitri Gielis - Sun, 2019-10-27 14:46
When using external (WEB/REST) services, you often communicate in JSON, so it's important to be able to generate JSON in the format the external service expects.

In the case of APEX Office Print (AOP), we made it super simple to communicate with the AOP server from the database through our PL/SQL API. You just have to enter a SQL statement and the AOP PL/SQL API, which uses APEX_JSON behind the scenes, generates the necessary JSON that the AOP Server understands.

Here's an example of the Order data in JSON: a customer with multiple orders and multiple order lines:

As we live in the Oracle database, we have to generate this JSON there. The data comes from different tables and is hierarchical. In SQL you can create hierarchical data by using the cursor() syntax.

Here's an example of the SQL statement that you would typically use in AOP (the cursor highlighted in red):

select
  'file1' as "filename", 
  cursor(
    select
      c.cust_first_name as "cust_first_name",
      c.cust_last_name  as "cust_last_name",
      c.cust_city       as "cust_city",
      cursor(select o.order_total      as "order_total", 
                    'Order ' || rownum as "order_name",
                cursor(select p.product_name as "product_name", 
                              i.quantity     as "quantity",
                              i.unit_price   as "unit_price"
                         from demo_order_items i, demo_product_info p
                        where o.order_id = i.order_id
                          and i.product_id = p.product_id
                      ) "order_lines"
               from demo_orders o
              where c.customer_id = o.customer_id
            ) "orders"
    from demo_customers c
    where customer_id = 1
  ) "data"
from dual

From AOP 19.3 onwards, the AOP PL/SQL API not only supports this cursor() syntax but also the native JSON functionality of the Oracle Database (version 12c and upwards).

The query above can also be written as the following using JSON support in the Oracle Database:

select 
  json_arrayagg( 
    json_object( 
      'filename' value 'file1', 
      'data'     value (
          select 
            json_arrayagg(
              json_object( 
                'cust_first_name' value c.cust_first_name, 
                'cust_last_name'  value c.cust_last_name,
                'cust_city'       value c.cust_city, 
                'orders'          value (
                    select 
                      json_arrayagg(
                        json_object(                               
                          'order_total' value o.order_total, 
                          'order_name'  value 'Order ' || rownum,
                          'order_lines' value (
                              select 
                                json_arrayagg(
                                  json_object(                               
                                    'product_name' value p.product_name, 
                                    'quantity'     value i.quantity,
                                    'unit_price'   value i.unit_price
                                  )
                                returning clob)      
                                from demo_order_items i, demo_product_info p
                               where o.order_id = i.order_id
                                 and i.product_id = p.product_id
                            )
                        )
                      returning clob)      
                      from demo_orders o
                    where o.customer_id = c.customer_id
                  )
              )
            returning clob)  
            from demo_customers c
            where c.customer_id = 1
          )
    )
  returning clob) as aop_json
  from dual 

You have to get used to this syntax and think a bit differently. Unlike the cursor syntax, where you define the column first and give it an alias, with the JSON functions you define the JSON object and attributes first and then map them to the correct columns.
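A tiny contrast (a toy query of my own, not from the post) makes that difference visible:

-- cursor() style: column first, aliased to the JSON attribute name
select cursor(select d.dummy as "x" from dual d) as "data" from dual;

-- native JSON style: attribute name first, then the column it maps to
select json_object('x' value d.dummy) as j from dual d;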

I find the cursor syntax really elegant; especially in combination with APEX_JSON, it's a really cool solution to generate the JSON you need. But I guess it's a personal choice what you prefer, and I must admit, the more I use the native JSON way, the more I like it. If performance is important, you most likely want to use native database functionality as much as possible, but I go into more detail further in this post. Lino also found an issue with the cursor syntax in Oracle Database 19c, so if you are on that database release you want to look at the support document.

Before I move on with my test case, if you need more info on JSON in the database: Carsten did a nice blog post about parsing JSON in APEX, and although that post is about parsing JSON and this one is more about generating JSON, the conclusions are similar. You can read more about APEX_JSON and the native JSON database functions in Tim's write-up on Oracle-Base.

As I was interested in the performance of both implementations, I ran a few test cases. There are different ways to test performance, e.g. dbms_profiler, Method R Workbench, tracing, timing the results, ... Below I use Tom Kyte's runstats script to compare the two implementations. The interesting thing about the script is that it compares not only timings but also latches, which give you an idea of how hard the database has to work. You can download it from AskTom under the Resources section:


Here's my test script:

declare
  l_sql             clob;
  l_return          blob;
  l_output_filename varchar2(100);  
  l_runs            number(5) := 1;
begin
  runStats_pkg.rs_start;
  
  -- sql example with cursor
  for i in 1..l_runs
  loop
      l_output_filename := 'cursor';
      l_sql := q'[
                select
                'file1' as "filename",
                cursor
                (select 
                    c.cust_first_name as "cust_first_name",
                    c.cust_last_name  as "cust_last_name",
                    c.cust_city       as "cust_city"
                   from demo_customers c
                  where c.customer_id = 1 
                ) as "data"
                from dual   
               ]';
      l_return := aop_api_pkg.plsql_call_to_aop (
                    p_data_type       => aop_api_pkg.c_source_type_sql,
                    p_data_source     => l_sql,
                    p_template_type   => aop_api_pkg.c_source_type_aop_template,
                    p_output_type     => 'docx',
                    p_output_filename => l_output_filename,
                    p_aop_remote_debug=> aop_api_pkg.c_debug_local); 
  end loop;  
  
  runStats_pkg.rs_middle;  
  
  -- sql example with native JSON database functionality
  for i in 1..l_runs
  loop
      l_output_filename := 'native_json';
      l_sql := q'[
                select 
                  json_arrayagg( 
                    json_object( 
                      'filename' value 'file1', 
                      'data'     value (select 
                                          json_arrayagg(
                                            json_object( 
                                              'cust_first_name' value c.cust_first_name, 
                                              'cust_last_name'  value c.cust_last_name,
                                              'cust_city'       value c.cust_city 
                                            )
                                          )  
                                          from demo_customers c
                                         where c.customer_id = 1
                                       )  
                    )
                  ) as aop_json
                  from dual 
               ]';
      l_return := aop_api_pkg.plsql_call_to_aop (
                    p_data_type       => aop_api_pkg.c_source_type_sql,
                    p_data_source     => l_sql,
                    p_template_type   => aop_api_pkg.c_source_type_aop_template,
                    p_output_type     => 'docx',
                    p_output_filename => l_output_filename,
                    p_aop_remote_debug=> aop_api_pkg.c_debug_local);                     
  end loop;    

  runStats_pkg.rs_stop;   
end;
/

I ran the script (with different l_runs settings) a few times on my 18c database, and with the above use case on my system, the native JSON implementation consistently outperformed the cursor (and APEX_JSON) implementation.

Run1 ran in 3 cpu hsecs
Run2 ran in 2 cpu hsecs
run 1 ran in 150% of the time

Name                                  Run1        Run2        Diff
STAT...HSC Heap Segment Block           40          41           1
STAT...Heap Segment Array Inse          40          41           1
STAT...Elapsed Time                      4           3          -1
STAT...CPU used by this sessio           4           3          -1
STAT...redo entries                     40          41           1
STAT...non-idle wait time                0           1           1
LATCH.simulator hash latch              27          26          -1
STAT...non-idle wait count              13          12          -1
STAT...consistent gets examina          41          43           2
LATCH.redo allocation                    1           3           2
STAT...active txn count during          21          23           2
STAT...cleanout - number of kt          21          23           2
LATCH.transaction allocation             1           3           2
LATCH.In memory undo latch               1           3           2
LATCH.JS Sh mem access                   1           3           2
STAT...consistent gets examina          41          43           2
LATCH.keiut hash table modific           3           0          -3
STAT...calls to kcmgcs                  64          69           5
STAT...dirty buffers inspected           6           0          -6
STAT...workarea executions - o           2          12          10
STAT...free buffer requested            71          52         -19
STAT...lob writes unaligned             80          60         -20
STAT...lob writes                       80          60         -20
STAT...sorts (rows)                      0          20          20
STAT...execute count                    91          71         -20
STAT...sorts (memory)                    0          20          20
LATCH.active service list                0          25          25
STAT...consistent gets                 183         156         -27
STAT...consistent gets from ca         183         156         -27
STAT...consistent gets pin (fa         142         113         -29
STAT...consistent gets pin             142         113         -29
STAT...lob reads                       160         130         -30
LATCH.JS queue state obj latch           0          42          42
LATCH.object queue header oper         151         103         -48
STAT...workarea memory allocat          66          -6         -72
STAT...db block changes                431         358         -73
STAT...consistent changes              390         315         -75
LATCH.parameter table manageme          80           0         -80
STAT...undo change vector size       8,748       8,832          84
LATCH.enqueue hash chains                1          88          87
STAT...parse count (total)             100          10         -90
STAT...session cursor cache hi         171          71        -100
STAT...opened cursors cumulati         171          71        -100
STAT...free buffer inspected           126           0        -126
STAT...calls to get snapshot s         470         330        -140
STAT...db block gets from cach         958         744        -214
STAT...hot buffers moved to he         220           0        -220
STAT...redo size                    12,016      12,248         232
STAT...db block gets                 1,039         806        -233
STAT...db block gets from cach       1,029         796        -233
STAT...session logical reads         1,222         962        -260
STAT...file io wait time             5,865       6,279         414
STAT...recursive calls                 561         131        -430
LATCH.cache buffers chains           3,224       2,521        -703
STAT...session uga memory          196,456           0    -196,456
STAT...session pga memory        1,572,864           0  -1,572,864
STAT...logical read bytes from   9,928,704   7,798,784  -2,129,920

Run1 latches total versus runs -- difference and pct
        Run1        Run2        Diff       Pct
       3,853       3,180        -673    121.16%

There are many different variations of this test: using bind variables, etc. It seems "logical" that a native DB implementation performs better than a combination of PL/SQL (APEX_JSON) and SQL (cursor), but I always recommend you run the test in your own environment. What is true today might be different tomorrow, and a lot comes into play. If there's one thing I learned from Tom Kyte, it's don't take things for granted, but test in your unique situation.

So, in real life using AOP, will you see a big difference? It depends on the complexity of your SQL statement and data, how many times you call a report, etc., but my guess is that in most cases there won't be much of a difference in user experience.

A simple test would be to do "set timing on" and compare the implementations:
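For instance (a trivial stand-in statement of my own; your queries and the timings will of course differ, output abbreviated):

SQL> set timing on
SQL> select json_arrayagg(json_object('x' value dummy)) from dual;

JSON_ARRAYAGG(JSON_OBJECT('X'VALUEDUMMY))
-----------------------------------------
[{"x":"X"}]

Elapsed: 00:00:00.01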


Or if you are using AOP on an Oracle APEX page, you can run your APEX page in Debug mode and you will see exactly how long the generation of the JSON took for the data part:


Happy JSON'ing :)

Categories: Development

Free Oracle Cloud: 14. Your Oracle Cloud Free Trial has expired (but FREE still running)

Dimitri Gielis - Thu, 2019-10-17 16:19
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

Today I got an email that my Oracle Cloud account had expired. While I have an Always FREE Oracle Cloud account, when I signed up I also got some extra credits that lasted for a month. Those credits are no longer valid.


When you log in to your Oracle Cloud Dashboard you will get a notification at the top too, but nothing to worry about.


It has some consequences though: in the menu, some options are grayed out. The one I actually use is Email Delivery, which seems to be grayed out too, although normally you should be able to send 1,000 emails per month. So maybe grayed out also means: not full service.


When I checked it out, it said it's part of the paid plan. I remember some discussions at Oracle Open World where they recommended upgrading to a Paid account; as long as you only use the Always FREE services, you are not charged.


So I decided to upgrade to a Paid account: Pay As You Go:


You have to provide a credit card, but that was a bit of an issue for me. Apparently, Safari doesn't work really well with this screen, so I switched to Chrome. The next hiccup I had was when I added my AMEX card... it said it was an invalid card.


Then I used my VISA card and that seemed to work well:

Click the Start Paid Account button:


Finally, it will say your payment method will be reviewed and after that you are live.


It wasn't immediately clear to me that I had to wait for the confirmation email, but when I went to Payment Method again, I saw the review was still in progress:


And a few minutes later I got the email that my account was upgraded:


When you look at your Oracle Cloud Dashboard, there's a cost calculator, so you can see how much you have to pay. As long as I use only the Always FREE components, I expect the amount to stay 0 :)


But the nice thing now is that you have access to all of Oracle Cloud again (e.g. Email Delivery).
Categories: Development

OGB Appreciation Day: add an error in a PL/SQL Process to the inline notification in Oracle APEX

Dimitri Gielis - Thu, 2019-10-10 12:01
This post is part of the OGB (Oracle Groundbreakers) Appreciation Day 2019, a thank you to everyone that makes the community great, especially those people that work at keeping us all moving!

Before I give my tip on how to add an error message from your PL/SQL code in your Page Process to a notification message in Oracle APEX, I want to start by thanking some people.

What keeps me going are a few things:

  • The innovations of technology and, more specifically, the Oracle Database, ORDS, and Oracle APEX. I want to thank all the developers and the people behind those products. The tools they create allow me to help other people, and I keep on learning about the new features that are released.
  • I want to thank the fantastic #orclapex (Oracle APEX) and Groundbreakers community. I believe we are a great example of how people help and support each other and are motivated to bring the technology further. Over time I got to know a lot of people, many of whom I now consider friends.
  • I want to thank you, because you read this, show your appreciation and push me forward to share more. I'm passionate about the technology I use. I love helping people with my skill set of developing software and, while I learn, sharing my knowledge through this blog.

So back to my tip of today... how do you show a message in the notification on a page?


You can do that with the APEX_ERROR PL/SQL API.


To see the usage yourself, create an empty page with one region and a button that submits the page.
In the Submit Process, add some PL/SQL code that raises an error.

For example:
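A minimal sketch of such a process (the message text is made up):

begin
  -- do your processing here; when something goes wrong, add the error
  -- to the inline notification area of the current page
  apex_error.add_error(
    p_message          => 'Something went wrong in this process!',
    p_display_location => apex_error.c_inline_in_notification );
end;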


That's it! Now you can get your errors in the notification message area of your Oracle APEX Page.

Categories: Development

Free Oracle Cloud: 13. Final things to take away

Dimitri Gielis - Thu, 2019-10-10 04:42
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

By now we have seen how you can set up the different components from the Always Free Oracle Cloud.

During Oracle Open World I talked to the people behind the Always Free Oracle Cloud, and they told me that when your account is inactive for a specified amount of time (I forget if it's 5 days, or a week, or more), your instance is backed up to the Object Storage. You can see it as a VM being put in standby: halted and saved to disk. When you need it again, it can be restored, but that takes time, and it might be annoying when you don't know that this is what is happening.

If you have a production app running in the Free Oracle Cloud, be sure people use your app at least once inside the window Oracle foresees. Maybe in the future, Oracle could provide a setting where we, as developers, can specify the (in-)activity window.

I'm really impressed by this free offering of Oracle and see many use cases for development environments and small to midsize applications. I believe the limits we get in the free plan are really generous and much more than any other cloud provider offers.
Here's a quick overview of what it looks like at the time of writing:
  • 2 Autonomous Databases, each with 1 OCPU and 20 GB storage
  • 2 Compute virtual machines, each with 1/8 OCPU and 1 GB memory
  • Storage:  2 Block Volumes, 100 GB total. 10 GB Object Storage. 10 GB Archive Storage.
  • Additional Services:  Load Balancer, 1 instance, 10 Mbps bandwidth. Monitoring, 500 million ingestion data points, 1 billion retrieval data points. Notifications, 1 million delivery options per month, 1,000 emails sent per month. Outbound Data Transfer, 10 TB per month.
So what if you outgrow these limits? It means your applications are successful, so you can be proud of that :) and at that time, hopefully, there's enough revenue to upgrade to a Paid Oracle Cloud plan. This can be done super easily... you click the upgrade to the paid plan button and there you go!
Oracle will copy your DB, instance, ... and you go from there.

The way Oracle does the upgrade is really cool, as it means you keep your free instance. So I see myself doing some development on the free instance, then upgrading to a paid plan for production. At that time I still have the development environment. The other free service could be the TEST environment, so you have DEV and TEST for free, and PROD paid.


If you didn't check it out by now, go and try out the FREE Oracle Cloud yourself by going to https://www.oracle.com/cloud/free/ :)


Thanks Oracle!
Categories: Development

Free Oracle Cloud: 12. Create a 2nd Compute Instance and a Load Balancer

Dimitri Gielis - Fri, 2019-10-04 11:51
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

In my blog post Create a VM Instance (Compute Cloud) we created a VM instance in the Free Oracle Cloud. The cool thing is that you get two VMs for free. In this post, we will set up the other always free compute instance.

Just like when we created our first instance, hit the Create a VM Instance button:


Give your instance a name. Previously I just hit the Create button, but this time you want to open the Show Shape, Network and Storage Options section first:


The most important part of that screen is the "Assign public IP address" section. If you don't need this Compute instance to be accessible from the internet, you can ignore it, but if you want to host a website, for example, you might want to check it. If you don't do it now, you can always add a public IP later, but I personally found that cumbersome: the network piece was hard to understand, and I had to do many different steps to get an internet connection to that machine, whereas when you have a public IP address, Oracle does everything for you. Anyway, it depends on your use case what you need, but I did want to highlight it. Also, it seems the default changed from when I wrote the first post: by default, you don't get a public IP address. It might be that Oracle is trying to push you to use a Load Balancer (see later in this blog post), and that might actually make sense.



Clicking the Create button will show that your instance is being provisioned.


When you go back to the overview you should see both of your Always Free Compute instances:


Clicking on the name, you will get the details. This screenshot shows what it looks like when you don't specify a public IP address.


To access that machine, as it doesn't have a public IP, I connected to my first instance, and from there, as I am on the same subnet, I can connect to the private IP address:


An alternative to pointing a URL directly at your VM instance is to front it with a Load Balancer.

Which brings us to the Load Balancer topic. With the Always Free Oracle Cloud, we also get a Load Balancer for free. There are different use cases for using a Load Balancer, but here are my own reasons why I have used a Load Balancer before:

  1. Distribute the traffic automatically over different machines. For example, when you use our APEX Office Print (AOP) Cloud you actually hit our load balancer; behind the load balancer we have two to five different machines. It's not only to handle the large number of prints we get, it also makes our lives easier when we want to upgrade without downtime. We upgrade one clone instance and, when done, new machines are brought online and old ones are shut down. We patch our own service with zero downtime.
  2. The Load Balancer has the SSL certificate and handles the HTTPS requests while the backend servers have HTTP.
  3. On a Load Balancer, you have integrated health checks, so you can be warned when things go wrong, even when there's only one server behind the Load Balancer.

So let's get started setting up a Load Balancer in the Oracle Cloud:

Click on Networking > Load Balancers:


Click the Create Load Balancer button:


It will ask for a name and type. For the Always Free instance, use Micro with Maximum Total Bandwidth.
By default Small is selected, so don't forget to change it:


Next you want to add a Backend to this Load Balancer, so click the Add Backends button:


In the pop-up you can select the instances you want to put behind this Load Balancer:


Furthermore, on the screen you can select a Health Check Policy:


In the next step, you can upload the SSL certificate, in case you want the Load Balancer to be accessible through HTTPS. You can also choose to just configure the Load Balancer for HTTP (which I don't recommend):


Hit the Create Load Balancer button and you will get an overview showing that the Load Balancer is being created:


Once it's ready the icon turns green and you will see the Public IP Address of your Load Balancer:


Instead of putting the IP address of your instance directly in the DNS of your domain name, you put in the IP address of the Load Balancer.

A Load Balancer can do much more: you can have different Rules, SSL tunneling, etc. You can read more about that in the online documentation.

Hopefully you now know how to set up a second compute instance, and you have an idea of what a Load Balancer can do for you.

We are almost done with this series... but you definitely want to read the next blog post, which is the last one where I give some important information to keep your Always Free instance running.
Categories: Development

Free Oracle Cloud: 11. Sending Emails with APEX_MAIL on ATP

Dimitri Gielis - Fri, 2019-10-04 08:38
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

In this post, we will configure the Oracle Cloud to support our instances, databases and Oracle APEX to send out emails. In my blog post 5. Setup APEX in ATP and create first APEX app, I initially said you can't use APEX_MAIL in APEX on ATP, but I was wrong, so a few days after my post I updated it to point you to the documentation with the configuration steps you have to follow to make it work.

The reason I thought you couldn't use APEX_MAIL was that, during my tests, sending emails failed. I hadn't read the documentation ;) In this post, I will share how I got it to work after all.

The first thing you have to do is create an SMTP Credential for your user. You do that by logging in to your Oracle Cloud account, going to Identity > Users and selecting your user:



Click the SMTP Credentials in the left menu:


Hit the Generate SMTP Credentials button, and give it a name:


On the next screen, you will see the USERNAME and PASSWORD. Take note of these, as you won't get to see the password again afterward:


You will come back to the overview screen, but again, as far as I can see there's no way to get the password again, so if you lose it, you need to set up a new one:


Now we will let Oracle APEX know about those parameters. Log in as ADMIN, through SQL Developer for example (see this blog post on how to do that), and run the following statement:

BEGIN
  APEX_INSTANCE_ADMIN.SET_PARAMETER('SMTP_HOST_ADDRESS', 'smtp.us-ashburn-1.oraclecloud.com');
  APEX_INSTANCE_ADMIN.SET_PARAMETER('SMTP_USERNAME', 'ocid1.user.oc1..xxxxxxxxxxx@ocid1.tenancy.oc1..xxxxxxx');
  APEX_INSTANCE_ADMIN.SET_PARAMETER('SMTP_PASSWORD', 'Fxxxx');
  COMMIT;
END;
/


Here is a screenshot when running:



Log in to Oracle APEX, go to SQL Workshop and try to send out emails:
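For instance, something like this (a minimal sketch; both addresses are placeholders, and the from-address must be an approved sender, which we set up below):

begin
  apex_mail.send(
    p_from => 'you@yourdomain.com',   -- placeholder, must be an approved sender
    p_to   => 'someone@example.com',  -- placeholder
    p_subj => 'Test from ATP',
    p_body => 'Hello from APEX_MAIL on the Always Free Oracle Cloud!' );
  apex_mail.push_queue;
end;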


It says "Statement processed", but when you query APEX_MAIL_QUEUE, you will see the mail is still stuck, with the error ORA-29278: SMTP transient error: 471 Authorization failed:


There's one more step you have to do: specify the email addresses you want to allow sending emails from. Go in your Oracle Cloud Account Dashboard to Email Delivery, click Email Approved Senders and hit the Create Approved Sender button:




Add the email address you want to send emails from and hit the Create Approved Sender button:


In the overview you will see all allowed email addresses:


When we try to send again and check the APEX_MAIL_LOG, we see the emails are effectively sent:


That's it, you can now send emails out of your APEX apps :)

We are almost done with the series. In the next post we will create a second compute instance and set up a Load Balancer.
Categories: Development

Free Oracle Cloud: 10. Running SQLcl and Datapump from the Compute Instance (VM) to ATP

Dimitri Gielis - Tue, 2019-10-01 05:30
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

In this post, we will install and use SQLcl and Datapump from the Compute Instance (VM) connecting to our Oracle Database in the Autonomous Transaction Processing (ATP) Cloud.
Although most of the time I use SQL Developer to connect to the database, I find it important to be able to use command-line tools too, as these are what you can automate, and they're really fast.
In the previous post, we installed the command line tools of the Oracle Cloud on our own machine, but for the Oracle Tools, I prefer to install them on our Compute Instance in the Cloud. Especially when we want to automate something, it's easier to do this from another machine in the cloud. It also makes it easier to follow as we only have to focus on how to install the Oracle Tools on Linux.

Oracle Instant Client

In order to connect to an Oracle database from a machine, we will use the Oracle Instant Client software. You can download the software for the different operating systems, but as our VM is running Oracle Linux we can install it with just a few commands:

First, update yum so it knows where to find Oracle software:

yum install oracle-release-el7

Next, we can search for the Oracle Instant Client version we need:

yum search oracle-instant


We want to install the Oracle Instant Client version matching the system we want to connect to. For the Free Oracle Database on ATP, that's Oracle Database Release 18.4, so we will pick Oracle Instant Client 18.5. To be honest, I typically take the latest version of the software, but when I tried that, the Oracle Instant Client complained that the libraries were not compatible with the version we wanted to connect to. I always thought you could use newer versions of the Oracle tools against older databases, but apparently that is no longer the case (at least not during my tests). Anyway, it's good to have the tool version match the version you connect to.

Install the Instant Client basic and tools packages:

yum install oracle-instantclient18.5-basic.x86_64
yum install oracle-instantclient18.5-tools.x86_64


As a last step we set some environment variables:

export PATH=/usr/lib/oracle/18.5/client64/bin:$PATH
export LD_LIBRARY_PATH=/usr/lib/oracle/18.5/client64/lib
export TNS_ADMIN=/usr/lib/oracle/18.5/client64/lib/network/admin

That's it! We can now use the Oracle tools. Note there's also a SQL*Plus package, which allows you to connect from the command line to the database, but I prefer to use SQLcl as it has some cool features for Oracle APEX (e.g. exporting your app). Download SQLcl now.

Before we move on to installing SQLcl, make sure you still have the credentials (wallet) zip file we used when connecting with SQL Developer to our database. Just like with SQL Developer, we also need it with SQLcl to connect to our database. As a reminder, here's the screenshot I'm talking about:


Upload both the SQLcl and credentials zip files to the Compute Instance (VM):

scp -i .ssh/oraclecloud /Users/dgielis/Downloads/wallet_DBDIMI.zip opc@132.145.215.55:/tmp

scp -i .ssh/oraclecloud /Users/dgielis/Downloads/sqlcl-19.2.1.206.1649.zip opc@132.145.215.55:/tmp


Connect to your VM and unzip the files:

ssh -i .ssh/oraclecloud opc@132.145.215.55

unzip /tmp/wallet_DBDIMI.zip -d /usr/lib/oracle/18.5/client64/lib/network/admin

unzip /tmp/sqlcl-19.2.1.206.1649.zip -d /opt


Before we can run SQLcl, we also need to make sure we have Java installed, as SQLcl depends on it:

yum install java

To make it easier to run SQLcl from anywhere we will create a symbolic link:

ln -s /opt/sqlcl/bin/sql /usr/lib/oracle/18.5/client64/bin/sql

Now we are ready to connect to our database on ATP:

sql admin@dbdimi_high


There we go... we can connect from our VM to our ATP database.

The next thing we want to do is export the data from our ATP database. We will use Datapump, which came with the installation of the tools.

Run the command to export the schema CLOUD:

expdp admin@dbdimi_high \
exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link \
data_options=group_partition_table_data  \
parallel=1 \
schemas=cloud \
dumpfile=export%u.dmp


So where did this export go? To the default DATA_PUMP_DIR directory, which we don't have direct access to... but to list the files in that directory we can run:

SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');


Remember my previous blog post about the Object Storage, in which we set up a Backups bucket?
Oracle allows you to connect your Object Storage to your ATP database and that is exactly what we will do further on :)

We will use the same user we created earlier for CLI. In order to connect to ATP we need to set up an Auth Token. Go to the User Details of cliUser and click the Auth Tokens:


Click the Generate Token button:


There's the token... you only see it once, so make sure to copy it:


Next, connect to your ATP database and run the script to add the credentials to the ATP database:

begin
  dbms_cloud.create_credential(
    credential_name => 'DEF_CRED_NAME'
    , username => 'cliUser'
    , password => 'Frx}R9lD0O}dIgZRGs{:'
  );
end;
/


Now that the DBMS_CLOUD package has credentials, we can do other calls with this package.
To add the Datapump export files to the Object Storage, we can use the PUT_OBJECT procedure.

I created a small script to take all the files from the DATA_PUMP_DIR and put them in the backups Bucket in the Object Storage:

begin
  for r in (select object_name, bytes
              from dbms_cloud.list_files('DATA_PUMP_DIR'))
  loop
   dbms_cloud.put_object(credential_name => 'DEF_CRED_NAME',
     object_uri => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/id9u4qbhnjxj/b/backups/o/'||r.object_name,
     directory_name => 'DATA_PUMP_DIR',
     file_name => r.object_name);
  end loop;     
end;
/

And when we check our bucket, we see the Datapump export files! Yay!


We also want to export our Oracle APEX apps. In some projects, I use the APEXExport utility, but now we will use SQLcl to export our APEX app 101:

apex export 101

In real life I typically create a few scripts which I can run one by one or combine in a general backup script. The script exports the Oracle schemas and the APEX apps and saves the files to another location, in our case the Object Storage.

vi make_backup.sh


Here are the details of the scripts which are called in the main backup script:
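As a rough illustration (a sketch of my own, assuming the names used earlier in this post, not the exact scripts from the screenshots), the database-side part could be a SQLcl script, with make_backup.sh being a small shell wrapper that runs expdp and then calls sql admin@dbdimi_high @backup_db.sql:

-- backup_db.sql: export APEX app 101, then copy the Datapump dump files
-- from DATA_PUMP_DIR to the backups bucket in the Object Storage
apex export 101

begin
  for r in (select object_name
              from dbms_cloud.list_files('DATA_PUMP_DIR'))
  loop
    dbms_cloud.put_object(
      credential_name => 'DEF_CRED_NAME',
      object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/id9u4qbhnjxj/b/backups/o/'||r.object_name,
      directory_name  => 'DATA_PUMP_DIR',
      file_name       => r.object_name);
  end loop;
end;
/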


You can schedule this script with crontab, for example, every day at 2AM:


The above is just an example of what you can do to automate your backups. You have to decide how frequently you want to do those backups.


If you want to move your existing Oracle database and APEX apps to the Oracle Cloud, the steps are similar to the above. You upload your Datapump export file to your Object Storage. Next, run the Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage and the credential parameter set to the name of the credential you created earlier. For example:

impdp admin/password@dbdimi_high \
     directory=data_pump_dir \
     credential=def_cred_name \
     dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/adwc/b/adwc_user/o/export%u.dmp \
     parallel=1 \
     partition_options=merge \
     transform=segment_attributes:n \
     transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y \
     exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link

Next, you would create your APEX workspace and import the APEX apps.


In the next post, we dive into Oracle APEX again and look at how to send emails from your APEX app.

Categories: Development

Free Oracle Cloud: 9. Setup Object Storage and use for File Share and Backups

Dimitri Gielis - Sat, 2019-09-28 11:22
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

In this post, we will look into how we can store and share files - with ourselves, with others, but also with the ATP database we set up earlier. For this purpose, the Oracle Cloud provides Object Storage, and we get 20GB for free forever. This storage is also used when you want to make backups of the database and use Datapump to export and import data. In the previous blog post, when you followed the installation of the on-premises version of AOP, you actually connected to my Object Storage, as the AOP zip file is on it.

Now that we know the purpose of Object Storage, let's get started setting it up.

Log in to the Oracle Cloud and navigate in the menu to Object Storage:

You arrive at the screen where you have to pick a compartment in the dropdown on the left. Compartments are used to organize your resources.


After selecting my compartment "dimi (root)", we get an overview of Buckets and we can create a new one by clicking on the Create Bucket button:


In the Create Bucket screen, you enter a name and the type of storage you want for that bucket.
First, we create an archive bucket, in which we will store the backups of our database:


In the next screen we create a standard bucket to share files with others:


We can now see we have two buckets: apexofficeprint (standard storage) and backups (archive storage). Note the 3 dots on the right of each bucket, which give you the ability to view details or perform some other actions:


As I wanted to share the AOP zip file in my object storage, I click the Edit Visibility link in the menu to make the bucket accessible for everybody (public).


Next, we will add a file to the bucket. Click the View Bucket Details:


We get the details of the Bucket and see an overview of the Objects in this Bucket:


Click the Upload Objects button:


Drag a file in the Drop files area and hit the Upload Objects button:


We arrive back in the overview screen:


Just like in the Buckets overview, in the Objects overview you have the 3 dots on the right of each object to perform actions on it:


Click View Object Details and, next to some other info, you find the URL where your object is accessible:


So to conclude: Object Storage consists of Buckets, which live in a certain Compartment, and a Bucket consists of Objects. (Object Storage => Compartments => Buckets => Objects)


Above we used the Oracle Cloud website to work with our Object Storage, but I'm also interested in doing this from the command line. For example, we automated our build process when we make a new release of APEX Office Print; in the final step, we want to upload the new zip file to the cloud. When we can do everything from the command line, we can script and automate it.


You can follow the documentation on how to install the Command Line Interface (CLI).

I did the following on OSX (from Terminal) to install CLI:

bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

The install script asked a few questions (where to install, etc.), but I used all the defaults, and at the end everything was installed fine and I got the following screen:


At the end I moved everything from the bin directory into my default directory, so it's easier to call the CLI:

mv bin/* .

Next, configure the CLI. We have to let the CLI know who we are. This is done by creating a config file and specifying the user the CLI can connect as. Here we go:

./oci setup keys


Go to the Oracle Cloud website and set up a User based on the keys you created with CLI.

Go to Identity - Users in the menu and click the Create User button:


Enter a new user. As I wanted to create a user specific to the CLI, I called it cliUser. Note that the email address you provide needs to be unique; I initially used the same email, but that didn't work.


Now we will need to add the key to this user. Click the View User Details button in the action menu of the user:


Click the Add Public Key button:


And copy the content of the ~/.oci/oci_api_key_public.pem file:


Next, we want to give this user Administrator privileges. To do so, go to Groups:


Click the Add User to Group button:


Select Administrators and hit the Add button:


Now, this user is all set to work with the Command Line Interface (CLI):


We will now configure the CLI to connect as the cliUser we just created.
In your terminal, run this from your home directory:

./oci setup config


The wizard asks for some OCIDs. Below I walk you through where to find them.

For the user OCID, go to Identity > Users and view the details of the user. Next to OCID, click the Copy link:


The tenancy OCID you find in Administration > Tenancy Details:





















The region you find when you scroll down to the Regions section of the Tenancy screen; yours will be highlighted in green.


The final question of the wizard is whether you want to create a new RSA key. Answer No and point to the key file (and not the directory, like I first did). This is how it looks after answering the questions:


Once the wizard is complete, you are all set. You can view the config file by doing:

cat .oci/config


Next, we want to create a Bucket in a Compartment and add a file through the CLI to that bucket.

Before we can run the command, we have to know the OCID of the Compartment. Here are the steps to get that. Go to Identity > Compartments:


In the Details you find the OCID:


Now that we have everything we need, we can run the command to create a bucket called clibucket:

./oci os bucket create -c ocid1.tenancy.oc1..aaaaaaaakmf6mlauyaqmkkcikiuu2ckmklhffxf2weheu3qtfnsvcuzfuiuq --name clibucket


On success, we get JSON back with the details of the bucket. On error, you will get an error back with details, also in JSON format.

Just to make sure the bucket is there, go to the Oracle Cloud website and check if you see the bucket we created with CLI:


To add an object to the bucket, you can do:

./oci os object put -bn clibucket --file test1.txt


Other commands I use frequently:

# Get a list of the objects:
./oci os object list -bn clibucket

# Download a file called test1.txt and save it on your system as test2.txt
./oci os object get -bn clibucket --file test2.txt --name test1.txt

A useful resource to learn which commands the CLI understands is this documentation.
I did the above as an administrator, but if you want to read more about restricted use, you can read about that in Lawrence Gabriel's blog post.

I believe you now have a good understanding of Object Storage and how to work with it through the website or through the CLI. One of the use cases I mentioned was using the Object Storage as a place for backups... let's look into that now.

Go to your Autonomous Database, select your database and go into the details. In the Resources section you find a Backups link:


Normally backups are automatically taken in ATP, but you can also create a manual backup:


When you create a manual backup, you have to specify the Object Storage bucket you want the backup to be saved in. But when I tried to create a manual backup, it told me that this option is not available in the Always Free Tier. It also says that restore is not possible, so I'm not sure what happens when you want to restore an automatically taken backup... For now, I'm not relying on those backups; in my next blog post I will tell you what I do for backups.


In the next blog post of this series, I will walk you through using Object Storage with the ATP Oracle Database to export and import data.
Categories: Development

Free Oracle Cloud: 8. Setup APEX Office Print (AOP) to export to PDF, Excel, Word, Powerpoint, HTML and Text

Dimitri Gielis - Wed, 2019-09-25 06:00
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

In the previous posts we set up our Always Free Oracle Cloud machine and an Autonomous Database with Oracle Application Express (APEX). In this post I want to show you how to get started with the popular printing and reporting engine, APEX Office Print (AOP). The AOP software makes it super easy to export your data into a nice looking PDF, a custom Excel file, a fancy Powerpoint or other output formats of your choice, just the way you want it.
AOP is being used by many customers, even Oracle internally, to export their data in the format they want. The data can come from the database, a REST or GraphQL web service, or even components like the Interactive Report/Grid from Oracle APEX. Although AOP works with any technology, it is most known in the Oracle APEX community as it's the easiest and most integrated print engine for Oracle APEX. You create a template in DOCX, XLSX, PPTX, HTML or TEXT, specify the data source, and tell AOP in which format you want the output to be (PDF, Excel, Word, Powerpoint, HTML, Text) and AOP will do the rest! You can find more information in this presentation about AOP.

Christina Moore of Storm Petrel wrote me the following a few days ago: "We have a client in one of our systems who generates a 1,888 page invoice monthly (about 2,000 pages). The most recent invoice was $1.3M USD and took 384MB. AOP handles it brilliantly. Well done. I can’t email it to you for confidentiality reasons, but know it has multiple sections that are merged with your tool too." I love feedback on the usage of AOP and am amazed how creative people are when developing with AOP!

I use AOP in every project because exporting/printing is a requirement sooner or later and an essential part of my Oracle APEX apps. So I thought I'd write about how to use it in the Oracle Cloud :)

We have two options: let our Oracle Autonomous Database and APEX talk to the AOP Cloud, or install an on-premises version of AOP on our own Compute VM.

Ok, so let's get started... open a browser, go to https://www.apexofficeprint.com and click the SIGN UP button:


Enter your email and hit Signup:


You will receive an email. Push the Confirm your email address button:


The browser will open where you can set a password for your account:


After hitting the Set Password button, you are logged in automatically and will see a Getting Started wizard:


Follow the wizard and you are all set! It should take less than 15 minutes :)

In short this is what the wizard will tell you:

  1. Download the AOP software and unzip the file
  2. Go to APEX > SQL Workshop > SQL Scripts > Upload and Run the file aop_db_pkg.sql which you find in the db folder. This will install the AOP PL/SQL API.
  3. Go to APEX > Your APP > Shared Components > Plug-ins and Import the APEX Plug-ins you find in the apex folder.
  4. Go to APEX > Your APP > Shared Components > Component Settings > APEX Office Print (AOP) and enter your API Key which you find in the Dashboard on the AOP site: 



The Component Settings in your APEX app:


When you look closely at the previous screenshot of the Component Settings, notice the AOP URL.
The URL specifies where the AOP Server is running; it's the endpoint the AOP APEX Plug-in and AOP PL/SQL API communicate with. By default it's set to the AOP Cloud, so you don't have to set up an AOP Server in your own environment.

Although the AOP Cloud is really convenient as it's maintained and supported by the APEX Office Print team, some customers prefer to run the AOP Server on their own machine, especially when data can't leave the datacenter.

So read on, and I will walk you through setting up the AOP Server on your own Compute VM in the Oracle Cloud. Just be sure you have already installed the AOP Sample Application, plug-ins, and database objects, if needed, as instructed in the Getting Started section above.

From a Terminal connect to your Oracle Cloud VM:

ssh -i ssh_key opc@public_ip

The first thing we do is change to the root user: we want to install some supporting software for AOP, and that is easier to do as root. Alternatively, you can prefix every command with sudo.

We logged in as the opc user; to become the root user we do:

sudo su

Unlike other reporting engines, the AOP software consists of just a couple of files and is installed in no time. We will download the software to the tmp folder on our machine and unpack it in /opt/aop:

cd /tmp

wget https://objectstorage.us-ashburn-1.oraclecloud.com/n/id9u4qbhnjxj/b/apexofficeprint/o/aop_free_oracle_cloud.zip

unzip aop_free_oracle_cloud.zip -d /opt/aop

That's it!! The AOP Server is installed!
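If you want to see what was unpacked, list the directory; you should find a version directory (v19.2.3 at the time of writing), which we will use when starting the server below:

ls /opt/aop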



To support PDF output, AOP relies on a 3rd party converter like MS Office or LibreOffice. Here are the steps to install LibreOffice:

# Install Java and CUPS, which LibreOffice depends on
yum install java-1.8.0-openjdk.x86_64

yum install cups.x86_64

# Download and unpack the LibreOffice RPM bundle
wget http://ftp.rz.tu-bs.de/pub/mirror/tdf/tdf-pub/libreoffice/stable/6.2.7/rpm/x86_64/LibreOffice_6.2.7_Linux_x86-64_rpm.tar.gz

tar -xvf LibreOffice_6.2.7_Linux_x86-64_rpm.tar.gz

# Install all the RPMs
cd /tmp/LibreOffice_6.2.7.1_Linux_x86-64_rpm/RPMS/

yum localinstall *.rpm

# Make the soffice binary available on the PATH
ln -s /opt/libreoffice6.2/program/soffice /usr/sbin/soffice

LibreOffice is installed. To check that everything is fine, run "soffice --version"; you should see something like this:
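For reference, the output is a single line with the version number and a build id, something like:

LibreOffice 6.2.7.1 <build_id>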



AOP comes with a built-in web server. When you start AOP you can define the port on which AOP listens for incoming requests; the default port is 8010. We need to tell Linux (more precisely, SELinux) that this port can handle HTTP and HTTPS requests.

semanage port -a -t http_port_t -p tcp 8010
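You can verify the port was registered by listing the ports SELinux knows about; 8010 should appear on the http_port_t line:

semanage port -l | grep http_port_t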

To start AOP on the default port do:

cd /
./opt/aop/v19.2.3/server/APEXOfficePrintLinux64 --enable_printlog &

You should see something like this:



Yay!! AOP is running.
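Besides the console output, a quick sanity check from the VM itself is to request the port with curl; any HTTP response confirms the server is listening (this assumes you kept the default port 8010):

curl -I http://127.0.0.1:8010/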

AOP comes with a cool Web Editor; we will make this Editor available on our domain at dgielis.com/aop/. In order to do that, we will adapt Nginx to also act as a reverse proxy for the AOP Web Editor.
Here we go:

vi /etc/nginx/conf.d/dgielis.com.conf

And add the following section:

  location /aop/ {
    proxy_pass http://127.0.0.1:8010/;
  }

The server part of the config file becomes:
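For reference, a minimal sketch of what the server block could look like. It assumes the Nginx setup from earlier in this series, so the server_name and the first location block are placeholders; only the /aop/ location is what we just added:

server {
  listen       80;
  server_name  dgielis.com www.dgielis.com;

  location / {
    # existing reverse proxy from the earlier post (placeholder)
    proxy_pass http://127.0.0.1:8080/;
  }

  location /aop/ {
    # the section we just added: forward /aop/ requests to the AOP Server
    proxy_pass http://127.0.0.1:8010/;
  }
}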



We need to reload Nginx:

nginx -s reload

And now when we go to dgielis.com/aop/ in a browser, we see the AOP Web Editor:



You can now, for example, load a sample by clicking the "Load sample" button and selecting PDF.
Scroll down a bit and click the Process button, and a PDF is generated :)



The Web Editor is built in React.js; you can drag and drop your template and add some data to test the features of AOP. There's also a Logging tab (toggle between Editor and Logging), so you can see incoming requests, results, and debug output in case of errors.


Now if we want to tell our Oracle APEX apps to use our own AOP Server, the only thing we have to do is change the AOP URL.

In your Oracle APEX app, go to Shared Components > Component Settings > APEX Office Print (AOP) and change the AOP URL to the URL of your own Compute VM:


That's it! You are all set to print and export data within your own environment :)

I would recommend looking at the AOP Sample App, which you installed in the last step if you followed the Getting Started wizard. It shows over 500 examples of how to use AOP and its features!

I hope you now have enough knowledge to please your customers with nice-looking PDFs, Excel files, and other documents in the format they want.


In the next post we will add an Object Storage to our Always Free Oracle Cloud Plan so we have a place to store files and backups.

Categories: Development
