Channel: Oracle Bloggers

JET Composite Components XIII - Patterns for Deferred and Asynchronous UI Loading


Introduction

This article outlines a series of simple approaches that you can use in JET Composite Components that do not have a fixed user interface. These might be UIs that are generated or altered dynamically based on the attributes defined for the Composite Component Tag, or maybe based on data that the Composite itself is retrieving from a remote service as it starts up. In the article I will outline three basic patterns in order of complexity:

  1. Pattern 1 - Simple Conditional and Looping based UIs
  2. Pattern 2 - Consolidated Template based UIs
  3. Pattern 3 - Separately Templated UIs
In all cases, the view that ends up being used for the Composite Component shares the same basic Component viewModel. All we are doing here is making the UI part a little more dynamic. These approaches can be useful when you want to display alternate views based on data that you simply don't have when the Composite Component is defined. For example, you may need the view to be sensitive to the runtime user role, the particular shape of a collection being displayed or even the type of device being used. Note that the patterns explained here can actually be blended together in various ways as well.

Pattern 1 - Simple Conditional and Looping based UIs

The pattern 1 approach simply leverages the power of basic Knockout syntax to dynamically vary the UI. Specifically, you can use conditional and looping constructs such as if, visible and foreach within the basic view that you define for the Composite Component. The framework will automatically resolve these as the Composite Component has its bindings applied. As a simple example to illustrate this, imagine that I have a component where I want to support two display modes for the consumer to select from: compact or full. To do this I could expose a compactView boolean property in my Composite Component metadata:

    {
      "properties": {
        …,
        "compactView": {
          "description" : "Hides the First-Name if set to TRUE, defaults to FALSE",
          "type": "boolean",
          "value" : false
        }
      },
      …
    }
    
Then the consumer can set the value to true if needed, thus:
<ccdemo-name-badge compact-view="true"  badge-name="..."/>
    
Then in my Composite Component view definition, ccdemo-name-badge.html, I could use a Knockout ifnot test to only display the first-name data if the mode is not compact:
<div class="badge-face">
      <img class="badge-image" data-bind="attr:{src: $props['badge-image'], alt: $props['badge-name']}"/>
      <!-- ko ifnot: $props.compactView -->
      <h2 data-bind="text: upperFirstName"/>
      <!-- /ko -->
      <h3 data-bind="text: $props['badge-name']"/>
    </div>
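For reference, the upperFirstName value bound in this view comes from the shared Composite Component viewModel. Here is a minimal sketch of that logic; the helper names are hypothetical stand-ins for the sample's actual _extractFirstName implementation, and I'm assuming badge-name holds a "First Last" style string:

```javascript
// Hypothetical sketch: derive the upper-cased first name shown in the
// full (non-compact) view from the badge-name property value.
function extractFirstName(badgeName) {
  // Everything before the first space, or the whole string if none
  return String(badgeName).split(' ')[0];
}

function toUpperFirstName(badgeName) {
  return extractFirstName(badgeName).toUpperCase();
}

console.log(toUpperFirstName('Ada Lovelace')); // ADA
```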
    

Pattern 2 - Consolidated Template Based UIs

Leading on from the basic use of Knockout if, foreach, etc. in the view HTML, we can take the logical next step of using the full Knockout template mechanism. The simplest version of this is to define the alternative view templates inline in the base view HTML for the Composite Component. To show this, I'll use the same Composite Component compactView boolean property as before. The big alteration is in the Composite Component view definition (ccdemo-name-badge.html):

<div class="badge-face" data-bind="template: { name: viewModeTemplate }"></div>

    <!-- Templates follow in-line -->

    <script type="text/html" id="compactTemplate">
      <img class="badge-image" data-bind="attr:{src: $props['badge-image'], alt: $props['badge-name']}"/>
      <h3 data-bind="text: $props['badge-name']"/>
    </script>

    <script type="text/html" id="fullTemplate">
      <img class="badge-image" data-bind="attr:{src: $props['badge-image'], alt: $props['badge-name']}"/>
      <h2 data-bind="text: upperFirstName"/>
      <h3 data-bind="text: $props['badge-name']"/>
    </script>
    
In this version of the HTML you can see that the main bulk of the component markup has been removed from the outer <div>, which has instead gained a template data-binding that uses a viewModel value called viewModeTemplate.
Additionally, the HTML has gained two scripts of type text/html called compactTemplate and fullTemplate respectively (based on their id attributes). These two scripts1 provide two alternative user interfaces that can be substituted into the main <div>.

The final ingredient to make this pattern work is the implementation of the viewModeTemplate property in the Composite Component viewModel. The Knockout template evaluation will expect this to contain a string which matches one of the available templates (e.g. "compactTemplate" or "fullTemplate"). In my example I'm triggering the change based on a boolean attribute called compactView. So in the property resolution for the Composite Component I can add a little logic to inspect that boolean value and store the appropriate template name into a property called viewModeTemplate. Here's the property resolution block2 with this added.

…
    function CCDemoNameBadgeComponentModel(context) {
     var self = this;
     context.props.then(function(propertyMap){
         //Save the resolved properties for later access
         self.properties = propertyMap;

         //Extract the badge-name value
         var badgeNameAttr = propertyMap['badge-name'];
         self._extractFirstName(badgeNameAttr);

         //New code to select the correct template to use
         var compactMode = propertyMap.compactView;
         if (compactMode){
             self.viewModeTemplate = 'compactTemplate';
         }
         else {
             self.viewModeTemplate = 'fullTemplate';
         }
     });
    …
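Because the template selection is just a string computation, it can also be factored into a small pure function and unit tested independently of Knockout. This is a sketch, not the sample's actual code; note too that if the template choice needed to react to later property changes, viewModeTemplate could be held in a ko.observable rather than a plain string:

```javascript
// Sketch only: map the resolved compactView property to the id of the
// inline <script> template that the template binding should render.
function viewModeTemplateFor(propertyMap) {
  return propertyMap.compactView ? 'compactTemplate' : 'fullTemplate';
}

console.log(viewModeTemplateFor({ compactView: true }));  // compactTemplate
console.log(viewModeTemplateFor({ compactView: false })); // fullTemplate
```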

Pattern 3 - Separately Templated UIs

The final variation is naturally the most powerful, but does involve a little more code. In this version we'll still use the Knockout template mechanism, but rather than encoding the different template options into <script> tags within the view HTML, we instead define totally separate HTML files for each desired UI variant. Using this approach we can actually remove the need for a placeholder HTML file and instead in-line that into the bootstrap loader.js. So in terms of files for our running sample composite component we might end up with:

    /ccdemo-name-badge
      loader.js
      ccdemo-name-badge.json
      ccdemo-name-badge.js
      ccdemo-name-badge.css
      ccdemo-name-badge-compact.html
      ccdemo-name-badge-full.html
    

Next we make a slight alteration in the bootstrap loader.js so as not to inject an initial HTML view via the RequireJS text plugin:

    define(
      ['ojs/ojcore', './ccdemo-name-badge','text!./ccdemo-name-badge.json',
       'css!./ccdemo-name-badge', 'ojs/ojcomposite'],
          function (oj, ComponentModel, metadata, css) {
              'use strict';
              oj.Composite.register('ccdemo-name-badge',
                  {
                      metadata: {inline: JSON.parse(metadata)},
                      viewModel: {inline: ComponentModel},
                      view: {inline: "<!-- ko template: {'nodes' : templateForView} --><!-- /ko -->"},
                      css: {inline: css}
                  });
          }
      );
    
Notice in the view property of the register() parameters that I'm now injecting an HTML string directly, and this encodes just a Knockout template reference. You'll also notice that this supplies the nodes property of the template, not the name (for more information about this see the Knockout documentation). Injecting this HTML inline in this way simply removes the requirement to define a separate HTML file which would need to be loaded in the define block, as per the previous examples we've seen3.

Next we need to work out how to get hold of the two possible template files within the Composite Component viewModel. The simplest approach here is to inject them through the define() block of the viewModel (although you could load them in other ways too, e.g. using require()). So amending our ccdemo-name-badge.js define block gives:

    define(
      ['ojs/ojcore','knockout','jquery',
       'text!./ccdemo-name-badge-compact.html',
       'text!./ccdemo-name-badge-full.html'
      ],
      function (oj, ko, $,compactTemplate,fullTemplate) {
      'use strict';
    …
Notice how the two HTML text streams are injected into the define function block as compactTemplate and fullTemplate respectively.

With this pattern, there is one more thing to do, which is to set up the templateForView property that Knockout is expecting to contain the element subtree used to implement the template. We do this using the activated lifecycle method in the Composite Component. This will check our compactView boolean property and then use the ko.utils.parseHtmlFragment() API to convert the correct template text stream into the subtree that Knockout expects:

    CCDemoNameBadgeComponentModel.prototype.activated = function(context) {
      if (this.properties.compactView){
          this.templateForView = ko.utils.parseHtmlFragment(compactTemplate);
      }
      else {
          this.templateForView = ko.utils.parseHtmlFragment(fullTemplate);
      }
    };
    

There are many variations of this final pattern that you could use, including building a completely dynamic UI with no source HTML files on disk at all.
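As a sketch of that fully dynamic variant (a hypothetical helper reusing the markup from the earlier templates), the template string could be assembled entirely in code and then handed to ko.utils.parseHtmlFragment() just as in the activated method above:

```javascript
// Hypothetical sketch: assemble the badge template markup in code,
// with no source HTML file on disk at all. The resulting string could
// then be parsed with ko.utils.parseHtmlFragment() as shown earlier.
function buildBadgeTemplate(includeFirstName) {
  var parts = [
    '<img class="badge-image" data-bind="attr:{src: $props[\'badge-image\'], alt: $props[\'badge-name\']}"/>'
  ];
  if (includeFirstName) {
    // Only the full view shows the extracted first name
    parts.push('<h2 data-bind="text: upperFirstName"></h2>');
  }
  parts.push('<h3 data-bind="text: $props[\'badge-name\']"></h3>');
  return parts.join('\n');
}
```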

What's Next?

In the next article I take the idea of dynamic CCA content one step further by showing you how a Composite Component can use ojModule to present multiple views, each with its own viewModel.


CCA Series Index

  1. Introduction
  2. Your First Composite Component - A Tutorial
  3. Composite Conventions and Standards
  4. Attributes, Properties and Data
  5. Events
  6. Methods
  7. The Lifecycle
  8. Slotting Part 1
  9. Slotting Part 2
  10. Custom Property Parsing
  11. Metadata Extensibility
  12. Advanced Loader Scripts
  13. Deferred UI Loading

1 Of course you are not restricted to just two alternative templates here; you can use as many as you like.

2 This is all using the same sample that I've been working with throughout this series of articles. Jump back to Part II if you've not encountered it before.

3 This does not stop you from doing so, however; it just depends on how you want to organize your code.


We Love Hearing From You! Read Adrienne Roey's Story.


Ready to be inspired to begin your Oracle Certification path today?

If so, read Adrienne Roey's Oracle Certification success story, originally shared in the Oracle Certification Community.

I've specialized in data reporting and integration for several years. When a colleague learned that I was studying for the Oracle Certified Associate SQL exam, he said, "Why? You already know this." While I appreciated his confidence in my abilities, I knew that I was using only a small portion of the features available to me in the latest version of Oracle software. Certifications are a way for me to stay current with the breadth and depth of the products.

Share your own story or begin your certification path today! 

Windows 10-Related EBS Certifications: February 2017 Edition


E-Business Suite certifications with Microsoft Windows 10 have become hard to track. This is partly due to the number of different things that run with Windows 10, including EBS components and browsers. In addition, Microsoft is positioning Windows 10 as a "Windows as a service" offering, which has resulted in a series of recent changes to their release vehicles and offerings.

I've been covering these regularly. Here's a recap of everything related to EBS certifications on Windows 10 to date:

Windows 10, Java, and Edge

Microsoft Edge does not support plug-ins, so it cannot run Forms. We are working on an enhancement request called "Java Web Start" to get around this Edge limitation.


Real-Time Integration Business Insight: External Dashboard



This video demonstrates how to include external dashboards in Oracle Real-Time Integration Business Insight. Watch the video here.

SOA & BPM Partner Community

For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community. For registration, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.


JET Composite Components and Oracle Sites Cloud Service


In a slight departure from my ongoing series of articles on CCA, I wanted to link through to an ongoing and related series of articles on using CCA within Oracle Sites Cloud Service. This is a great example of how CCA is converging on becoming a common standard for component creation throughout the Oracle + JavaScript universe.

Anyway, for more information on CCA and Sites, head over to Igor Polyakov's blog on the subject:

To follow my CCA learning path, start at the beginning: JET Composite Components I - Backgrounder

 

Mobile, Cloud and AI...The (chat)Bots are here!



Mobile and Cloud are ubiquitous. Chances are, you have a mobile device within reach, and in your daily life you use a Cloud-based service. Mobile is the "first screen." As for Cloud, and more specifically Cloud Native: when it comes to ease of provisioning, setup, use, flexibility, extensibility, security and maintenance, and building on the growing trends in API-first, Microservices, and Functions, the Cloud technology and economic model is an easy call.



Mobile and Cloud have come together to change the landscape


So what's next? While mobile is the "first screen," messaging apps are the dominant apps used on those screens, and Artificial Intelligence, e.g. intelligent chatbots, is the new interface. Or maybe there is no screen at all, but a voice interface instead. Hey Siri, who was Alan Turing? Who was John McCarthy?


Alan Turing asked "Can machines think?" in his 1950 paper, Computing Machinery and Intelligence.

John McCarthy coined the term "Artificial Intelligence" at the first AI research conference at Dartmouth College.

Artificial Intelligence, or simply "AI," is the next big shift in technology. I'll tread carefully because, historically, AI's been emerging since the fifties, but the compute power wasn't yet there and the science needed to mature through a few "AI winters." Now, thanks to many brilliant minds and exponential growth in compute power (Moore's law), this time it's the dawn of the AI era. Serious breakthroughs have occurred, and the time between AI milestones is shrinking.

A few examples:

1997: Deep Blue defeats Garry Kasparov, grandmaster World Chess Champion. (OK, an 8x8 grid, not that surprising. What about Go? The complexity is magnitudes greater.)

2016: DeepMind's AlphaGo beats Lee Sedol, the world's grandmaster Go champion. A decisive win, with creative moves Mr. Sedol and the Go community had never seen.


But Chess and Go are games with “perfect information.” What about imperfect games where some information is hidden? Like poker. Surely, human intuition has an edge?



Jan 2017: Carnegie Mellon's Libratus poker bot beats four of the world's best professional poker players at heads-up, no-limit Texas Hold'em. The win was decisive.

Games with imperfect information... Ouch.


The milestones are impressive, but I think we have some time before the singularity, and I'm not ready to be a house cat to the AI overlords. The Luddite option isn't right either. Let's look near term and narrow these topics to the business of the enterprise, where we will all need to step up our game!


Messaging and Intelligent Chatbots


Messaging apps are the most common way for people to communicate with each other, and it is becoming more common (and often preferable) to communicate with businesses the same way. Messaging is the latest interface to reach everyone because it's natural. Mobile apps are great, but the mobile home screen can only fit so many apps. Since messaging apps are popular (WhatsApp, WeChat, Kik, FB Messenger, etc.), automated, intelligent chatbots offer a way to establish a scalable, natural conversation between humans and businesses while increasing quality, consistency and response times. Chatbots are not yet at the stage of passing the Turing test, and they don't replace humans working on complex problems; instead, they offload simple to moderately complex inquiries automatically, freeing expert staff to work on the complex questions.


Without a doubt, Mobile, Cloud and AI/chatbots are in your future. Whether you will be driving or following is up to you. You'll want an open, flexible environment that is secure, scalable and intelligent, from a trusted and proven partner. Oracle is the leader in SaaS, PaaS and IaaS, and is "all in."


To learn more about Oracle's mobile, cloud and AI/chatbot development:


  • Oracle Code: a 20-city worldwide tour built for developers. Get hands-on experience with Oracle's Cloud Native, Mobile and Open Source adoptions, with technical presentations and hands-on labs. And please visit developer.oracle.com

  • Watch Larry Ellison introduce intelligent chatbots in the 2016 Oracle OpenWorld keynote, and stay tuned for more exciting announcements.

  • Quickly! Before our AI overlords take over, follow us @OracleMobile and join the Oracle Mobile conversation on LinkedIn.

    Automating DevOps for the Oracle Database with Developer Cloud Service and SQLcl


    In the previous blog entry I showed how you can leverage Oracle Developer Cloud Service (DevCS) to manage the lifecycle of your database creation scripts (version management, branching, code reviews, etc.).

    But how do you make sure that your actual database is in sync with the changes that you make in your scripts?

    This is another place where DevCS can come to the rescue with the built-in continuous integration functionality it provides, specifically the new features for database integration, including secure DB connection specification and the powerful SQL Command Line (SQLcl), the new command-line interface to the Oracle DB, which is built into the DevCS build servers.

    In the video below I go through a process where a check-in of a SQL script change automatically initiates a build process that modifies a running database.

    A few points to note:

    • For the sake of simplicity, the demo doesn't follow the recommended step of a code review before merging changes into the master branch (you can see how to do that here).
    • The demo automates running the build whenever a change to the scripts is checked in. You could also define a scenario where the build runs at a specific time every day - for example at 1am - and syncs the DB to that day's scripts.
    • You can further extend the scenario shown here of dropping and re-creating objects to add steps to populate the DB with new data and even run tests on the new database.
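    To make the flow concrete, here is a rough sketch of what such a build step could look like. This is illustrative only: the connection details, environment variables and script names are hypothetical placeholders, not the exact configuration used in the video.

```shell
# Hypothetical DevCS build step: run the checked-in scripts against a
# target database with SQLcl. Connection details and file names are
# placeholders supplied by the build configuration, not demo values.
set -e

sql ${DB_USER}/${DB_PASSWORD}@//dbhost:1521/PDB1 <<EOF
@drop_objects.sql
@create_objects.sql
@populate_data.sql
exit
EOF
```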

     

    As you can see, Developer Cloud Service can be a very powerful engine for your database DevOps - and it is included for free with your Oracle Database Cloud Services - so give it a try.

    Announcing Software Collections 2.3


    We are pleased to announce the release of Software Collections 2.3 to the Unbreakable Linux Network and the Oracle Yum Server.

    The Software Collections library allows you to install and use several different versions of the same software at the same time on a system. Software collections are primarily intended for development environments. These environments often require more recent versions of software components such as Perl, PHP, or Python to access the latest features. However, they also need to avoid the risk of disrupting processes on the system that rely on different versions of these components.

    You use the software collection library utility (scl) to run the developer tools from the software collections that you have installed. The scl utility isolates the effects of running these tools from other versions of the same software utilities that you have installed.
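    For example (illustrative only; rh-python35 is one of the collections listed below for Oracle Linux 6, and the commands assume it is available in your configured yum channels):

```shell
# Install a collection, then run tools from it via scl. The system
# default python remains untouched outside the scl environment.
sudo yum install rh-python35

# Run a single command inside the collection's environment
scl enable rh-python35 'python --version'

# Or start an interactive shell with the collection enabled
scl enable rh-python35 bash
```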

    New Software Collections for Oracle Linux 7

    The following collections have been added to the Software Collections for Oracle Linux 7:

    • devtoolset-6
    • rh-eclipse46
    • rh-git29
    • rh-redis32
    • rh-ruby22
    • rh-perl524
    • rh-php70

    New Software Collections for Oracle Linux 6

    The following collections have been added to the Software Collections on Oracle Linux 6:

    • devtoolset-6
    • rh-git29
    • rh-redis32
    • rh-perl524
    • rh-php70
    • rh-python35
    • rh-maven33
    • rh-ror42
    • rh-ruby22
    • rh-ruby23

     

    How to use Software Collections on Oracle Linux

    The Software Collections Library 2.3 for Oracle Linux guide covers installation and use of all the Software Collections on Oracle Linux 6 and 7.

    Support

    Support for Software Collections is provided at no extra cost to customers with an Oracle Linux Premier Support subscription.

    If you do not have paid support, you can get peer support via the Oracle Community forums at https://community.oracle.com.


    Big Data SQL Quick Start. Oracle Text Integration – Part 18

    Today, we'll focus on the integration of Oracle Text, the full-text indexing capability of the Oracle database, with documents residing on HDFS. Oracle Text has been available for years and has evolved to address today's indexing needs: 150+ document formats supported... [Read More]

    Big Data SQL Quick Start. Machine Learning and Big Data SQL – Part 19

    It's a very frequent case that when somebody talks about Big Data, he or she also wants to know how to apply Machine Learning algorithms over these data sets. Oracle Big Data SQL provides an easy and seamless way to do this. The secret lies in the Oracle Advanced Analytics (OAA) option, which has existed for many... [Read More]

    How Dev/Test in the cloud is accelerating delivery of Oracle Middleware projects by Matt Wright



    The source of competitive advantage and value that an organization delivers to its end customers is increasingly defined by the software “systems” that underpin them. As a result, organizations find themselves in a digital race, where the speed at which IT can reliably deliver new features and innovations is what sets them apart from their competition.

    "In an industrial company, avoid software at your own peril . . . a software company could disintermediate GE someday, and we're better off being paranoid about that."

    Jeff Immelt, CEO, General Electric

    These innovations are seldom delivered by pre-packaged business applications, whether running on-premise or delivered as Software as a Service (SaaS), but by custom solutions derived in-house. Yet, most organizations have neither the time nor funds to build these systems of innovation from the ground up. Instead, they are delivered by layering new capabilities on top of existing applications, an approach defined by Gartner as “Pace-Layered”.

    Oracle Middleware, such as the Oracle BPM Suite and Oracle SOA Suite provides the application glue to rapidly and continually combine these business apps, like puzzle pieces, into a custom integrated solution in order to deliver a seamless and unified experience to the customer.

    Yet even with this Pace-Layered approach, many IT projects are still failing to deliver either on-time or on-budget, with development teams often held back by their own IT organization. So how can you reduce the cost of the software you develop and decrease the time it takes to get it right?

    Research shows moving development to the cloud can initially reduce development time by 11 to 20 percent. Organizations that fully embrace the cloud for Dev and Test are experiencing 30%+ time savings upon maturing their DevOps capabilities. Read the complete article here.


    Converting ADLs to implement end to end JSON in SOA Suite 12.2.1 -PART I by Luis Augusto Weir



    There is no doubt that web [REST] APIs have become extremely popular, and their usage has gone well beyond just building APIs in support of mobile apps. We can see the adoption of resource-oriented architectures (ROA) by probably all SaaS vendors, who provide out-of-the-box APIs as the means to connect and interact with their cloud applications. Take for example the Oracle Cloud. To discover and consume publicly available Oracle SaaS APIs, all one needs to do is browse the Oracle API Catalog Cloud Service (which is publicly accessible) and select the Swagger definition for any given API.

    But (as you probably already know) the adoption of web APIs hasn't stopped there. With the increased popularity of Microservice Architectures, initiatives such as Open Legacy, and node.js-based frameworks like loopback and sails (to name a few), API-enabling systems of record is becoming a lot easier.
    This is putting a lot of pressure on software vendors to quickly modernise their integration suites to natively support the technology stacks and patterns prevalent in these types of architectures. For example, if an organisation's mobile application needs to interact with a system of record (on premise or in the cloud) that already exposes a web API, the integration stack should be capable of supporting JSON over HTTP end-to-end without having to convert to XML back and forth. Not only is this impractical, but it introduces more processing burden on the core stack...
    Luckily for many Oracle customers and Oracle Fusion Middleware / Oracle PaaS practitioners like myself, with the latest release of Oracle SOA Suite (12.2.1), one of the many new features introduced is support for handling JSON end-to-end. I don't want to understate the importance of this, as with such a feature it is possible to use BPEL, for example, to orchestrate several APIs (all in native JSON, and also in-memory with the new SOA in-memory feature) and therefore deliver coarse-grained business APIs that actually perform.
    For me this represents an important milestone for Oracle SOA Suite, as it shows the departure from the traditional SOA tech stack into SOA 2.0 (as I like to call it): the suite is now better suited to support the adoption of ROA, microservices, IoT, and so on. Having worked with SOA Suite since 10.1.3.1, this is very exciting. Read the complete article here.


    Webcast: "EBS Technology: Latest Features and Roadmap"


    Oracle University has a wealth of free webcasts for Oracle E-Business Suite. If you're looking for a summary of recent updates from the Applications Technology Group, see:

    Lisa Parekh, Vice President Technology Integration, provides an overview of Oracle’s recent advances and product strategy for Oracle E-Business Suite technology. This is the cornerstone session for Oracle E-Business Suite technology. Come hear how you can get the most out of Oracle E-Business Suite by taking advantage of the latest user interface updates and mobile applications.  Learn about systems administration and configuration management tools for running Oracle E-Business Suite on-premises or in the cloud, and hear how the latest technologies can be used with Oracle E-Business Suite to improve performance, security, and ease of integration for your system. This material was presented at Oracle OpenWorld 2016.

    Oracle Database 12.2 for Exadata/SuperCluster available

    Oracle Database 12.2.0.1 on-premises for Exadata and SuperCluster is now available for download from Oracle eDelivery/SoftwareCloud. Connect to eDelivery and log in: search for "Oracle Database" and mark "Oracle Database Enterprise Edition ...". Click on "Select... [Read More]

    Introducing Oracle Data Integrator Cloud Service (ODICS)!


    We are pleased to announce and welcome Oracle Data Integrator Cloud Service (ODICS)!

    Read the press release:  Oracle Launches Cloud Service to Help Organizations Integrate Disparate Data and Drive Real-Time Analytics.

    Overview

    Oracle Data Integrator Cloud Service (ODICS) delivers high-performance data movement and transformation capabilities with its open and integrated E-LT architecture and extended support for Cloud and Big Data solutions. Oracle Data Integrator Cloud Service provides all of the functionality included in Oracle Data Integrator Enterprise Edition in a single heterogeneous Cloud Service integrated with the Oracle Public Cloud. Providing an easy-to-use user interface combined with a rich extensibility framework, Oracle Data Integrator Cloud Service improves productivity, reduces development costs and lowers total cost of ownership across data-centric architectures. Oracle Data Integrator Cloud Service is fully integrated with Oracle Platform as a Service (PaaS) offerings such as Oracle Database Cloud Service, Oracle Database Exadata Cloud Service and/or Oracle Big Data Cloud Service to put data and value at the center of the enterprise. Oracle Data Integrator Cloud Service is open and standards-based, such that it can work with 3rd party systems as well as Oracle's solutions.


    Cloud E-LT Architecture for High Performance

    Oracle Data Integrator Cloud Service's E-LT architecture leverages disparate relational database management systems (RDBMS) or Big Data engines to process and transform the data. This approach optimizes performance and scalability and lowers overall solution costs. Instead of relying on a separate, conventional ETL transformation server, Oracle Data Integrator Cloud Service's E-LT architecture generates native code for disparate RDBMS or big data engines (SQL, HiveQL, or bulk loader scripts, for example). The E-LT architecture extracts data from the disparate sources, loads it into a target, and executes transformations using the power of the database or Hadoop. By leveraging existing databases and big data infrastructures, Oracle Data Integrator Cloud Service provides unparalleled efficiency and lower cost of ownership. By reducing network traffic and transforming data in the server containing the target data, the E-LT architecture delivers the highest possible performance for Cloud environments.

    Heterogeneous Cloud Support

    Oracle Data Integrator Cloud Service provides heterogeneous support for 3rd party platforms, data sources, data warehousing appliances and Big Data systems. While Oracle Data Integrator Cloud Service leverages optimizations for Oracle Database and Big Data Cloud Services to perform E-LT data movement, transformation, data quality and standardization operations, it is fully optimized for mixed technologies, including sources, targets and applications.


    Knowledge Modules Provide Flexibility and Extensibility

    Knowledge Modules are at the core of the Oracle Data Integrator Cloud Service's architecture. They make all Oracle Data Integrator processes modular, flexible, and extensible. Knowledge Modules implement the actual data flows and define the templates for generating code across the multiple systems involved in each data integration process. Knowledge Modules are generic, because they allow data flows to be generated regardless of the transformation rules. At the same time, they are highly specific, because the code they generate and the integration strategy they implement are explicitly tuned for a given technology. Oracle Data Integrator Cloud Service provides a comprehensive library of Knowledge Modules, which can be tailored to implement existing best practices, ranging from leveraging heterogeneous source and/or target systems, to methodologies for highest performance, for adhering to corporate standards, or for specific vertical know-how. By helping companies capture and reuse technical expertise and best practices, Oracle Data Integrator Cloud Service's Knowledge Module framework reduces the cost of ownership. It also enables metadata-driven extensibility of product functionality to meet the most demanding data integration challenges.

    Oracle’s Data Integration solutions provide continuous access to timely, trusted, and heterogeneous data across the enterprise to support both analytical and operational data integration. We look forward to hearing how you might use Oracle Data Integrator Cloud Service within your enterprise.


    Converting ADLs to implement end to end JSON in SOA Suite 12.2.1 -PART I by Luis Augusto Weir


    There is no doubt that web (REST) APIs have become extremely popular, and their usage has gone well beyond just building APIs in support of mobile apps. We can see the adoption of resource-oriented architectures (ROA) by probably all SaaS vendors, who provide out-of-the-box APIs as the means to connect and interact with their cloud applications. Take for example the Oracle Cloud. To discover and consume publicly available Oracle SaaS APIs, all one needs to do is browse the Oracle API Catalog Cloud Service (which is publicly accessible) and just select the Swagger definition for any given API.

    But (as you probably already know) the adoption of web APIs hasn't stopped there. With the increased popularity of Microservice Architectures, initiatives such as Open Legacy, and Node.js-based frameworks like LoopBack and Sails (to name a few), API-enabling systems of record is becoming a lot easier.
    This is putting a lot of pressure on software vendors to quickly modernise their integration suites to natively support the technology stacks and patterns prevalent in these types of architectures. For example, if an organisation's mobile application needs to interact with a system of record (on premise or in the cloud) that already exposes a web API, the integration stack should be capable of supporting JSON over HTTP end-to-end without having to convert to XML back and forth. Not only is converting impractical, it also introduces more processing burden on the core stack...
    Luckily for many of Oracle's customers and Oracle Fusion Middleware / Oracle PaaS practitioners like myself, with the latest release of Oracle SOA Suite (12.2.1), one of the many new features introduced is support for handling JSON end-to-end. I don't want to understate the importance of this, as with such a feature it is possible to use BPEL, for example, to orchestrate several APIs (all in native JSON, and also in-memory with the new SOA in-memory feature) and therefore deliver coarse-grained business APIs that actually perform.
    For me this represents an important milestone for Oracle SOA Suite, as it shows the departure from the traditional SOA tech stack into SOA 2.0 (as I like to call it): the suite is now better suited to support the adoption of ROA, microservices, IoT, and so on. Having worked with SOA Suite since 10.1.3.1, this is very exciting. Read the complete article here.

    SOA & BPM Partner Community

    For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center.

    Blog | Twitter | LinkedIn | Facebook | Wiki

    JET Composite Components XIV - A Pattern for Multi-view Components using ojModule


    Introduction

    In general, one would expect components to be fairly limited in scope and only have to define a single UI (albeit one that might be reasonably dynamic, as discussed in this previous article). However, the occasion may arise where the component needs to be more complex and maybe needs some scoped sub-views. A good example of this might be a wizard-style UI in which you need to encapsulate several distinct "pages" to step through into a single component.

    This article outlines an approach you can take to implementing this kind of Composite Component by leveraging the capabilities of JET modules (ojModule). I would emphasize that this approach is only needed when your sub-views within the Composite Component all need their own viewModels. If all you want to do is define a number of views all based on the Component viewModel, then you can stick to Knockout templates to switch the UI.

    What's The Problem?

    For this scenario I want to package up the module views and viewModels as part of the Composite Component. So for example I might have the following file structure:

          /js
            /components
              /ccdemo-wizard
                loader.js
                ccdemo-wizard.js
                ccdemo-wizard.css
                ccdemo-wizard.json
                ccdemo-wizard.html
                /modules
                  /views
                    ccdemo-wizard-page-1.html
                    ccdemo-wizard-page-2.html
                    ccdemo-wizard-page-3.html
                  /viewModels
                    ccdemo-wizard-page-1.js
                    ccdemo-wizard-page-2.js
                    ccdemo-wizard-page-3.js
        
    Normally when using ojModule, the locations of the views and viewModels for a named module are global to the application and defined by convention as /js/views and /js/viewModels respectively. Obviously, a particular Composite Component cannot go around changing the default load locations for modules; that could break the consuming application. So we need to find a way to direct ojModule to load each wizard page from within the Composite Component folder structure. As well as that requirement, we need to ensure that the life-span of each viewModel that makes up the wizard pages matches that of the Composite Component as a whole, and that we can successfully use multiple instances of the component concurrently without running into problems.

    The Approach

    The approach I'm recommending here fulfills all of these requirements by taking advantage of the viewModelFactory capability of ojModule. The use of a factory to create the viewModel for a particular module is then coupled with a facility, also supported by ojModule, for a module's viewModel to expose a method to generate or otherwise obtain the view portion dynamically. Using these two capabilities together, we can load the views and viewModels we want to use from anywhere, including within the Composite folder structure.

    Let's step through the process.

    Step 1 - Define the Core Composite Component

    To illustrate this I'll start off with a basic component called ccdemo-wizard here's the metadata:

        {
          "name":"ccdemo-wizard",
          "version" : "1.0.0",
          "jetVersion":">=2.2.0",
          "events" : {
              "wizardCancelled": {
                  "description" : "Called if the user presses the Cancel button in the wizard",
                  "bubbles" : true,
                  "cancelable" : false
              },
              "wizardComplete": {
                  "description" : "Called if the user presses the Finish button in the wizard",
                  "bubbles" : true,
                  "cancelable" : false
              }
          }
        }
        
    As you can see it's pretty simple and really only defines two events to allow the consumer to detect the overall state of the wizard. The component will also have a very standard loader script (loader.js) and style sheet (ccdemo-wizard.css). There is nothing unusual about these so I've not reproduced them here.

    Step 2 - The Component View Template

    Next, the HTML template for the Composite Component itself (ccdemo-wizard.html). This just contains a module binding and the buttons to control paging through the wizard and to raise the wizardCancelled or wizardComplete events:

    <div class="oj-panel-alt5">
          <div data-bind="ojModule: wizardModuleSettings"></div>
          <div>
            <button data-bind="click: previousPage, enable : previousEnabled">Previous</button>
            <button data-bind="click: nextPage, enable : nextEnabled">Next</button>
            <button data-bind="click: cancelWizard">Cancel</button>
            <button data-bind="click: finshWizard">Finish</button>
          </div>
        </div>
        
    So the key thing here is the <div> with the data-bind to ojModule. This binds the ojModule to an object called wizardModuleSettings in the Composite Component viewModel.

    Step 3 - The Composite Component ViewModel

    I'll start off by listing the code for the Composite Component viewModel and then I'll break down each part:

          define(
              ['ojs/ojcore','knockout','jquery',
               './modules/viewModels/ccdemo-wizard-page-1',
               './modules/viewModels/ccdemo-wizard-page-2',
               './modules/viewModels/ccdemo-wizard-page-3',
               'ojs/ojmodule'
              ], function (oj, ko, $, Page1Model, Page2Model, Page3Model) {
              'use strict';
              function CCDemoWizardComponentModel(context) {
                  var self = this;
                  self.composite = context.element;
                  self.currentPage = ko.observable(1);
                  self.pageArray = [new Page1Model(this),
                                    new Page2Model(this),
                                    new Page3Model(this)];
    
              self.modelFactory = {
                          createViewModel: function(params, valueAccessor)
                          {
                            return Promise.resolve(self.pageArray[self.currentPage()-1]);
                          }};
 
              self.wizardModuleSettings = ko.observable(
                  {
                    createViewFunction:'resolveModuleView',
                    viewModelFactory: self.modelFactory
                  });
    
                  self.nextEnabled = ko.pureComputed(function(){
                      return self.currentPage() < 3;
                  });
    
                  self.previousEnabled = ko.pureComputed(function(){
                      return self.currentPage() > 1;
                  });
              };
    
    
              CCDemoWizardComponentModel.prototype.cancelWizard = function(viewModel, data){
                  var eventParams = {'bubbles' : true,'cancelable' : false};
                  this.composite.dispatchEvent(new CustomEvent('wizardCancelled',eventParams));
              };
    
              CCDemoWizardComponentModel.prototype.finshWizard = function(viewModel, data){
                  var eventParams = {'bubbles' : true, 'cancelable' : false};
                  this.composite.dispatchEvent(new CustomEvent('wizardComplete',eventParams));
              };
    
    
              CCDemoWizardComponentModel.prototype.nextPage = function(viewModel, data){
                  this._changePage(this.currentPage() + 1);
              };
    
              CCDemoWizardComponentModel.prototype.previousPage = function(viewModel, data){
                  this._changePage(this.currentPage() - 1);
              };
    
              CCDemoWizardComponentModel.prototype._changePage = function(pageNo){
                  this.currentPage(pageNo);
                  this.wizardModuleSettings.valueHasMutated();
              };
    
              return CCDemoWizardComponentModel;
          });
        
    Let's break that down:

    Module viewModel Imports in the Define Block

    The first thing that this model does is to define the viewModels for each of the separate modules that will make up the wizard (three in all). By using the define block in this way, we are automatically loading the files (e.g. ccdemo-wizard-page-1.js) from a location relative to the Composite Component folder, rather than from the default module location of /js/viewModels. Each viewModel constructor for the modules is mapped into the main function block as Page1Model, Page2Model, etc. We'll look at the definition of those classes in a moment.

    Set up a currentPage Observable

    We'll need to keep track of which "page", and therefore which module, should be displayed. This is done by creating the currentPage observable, initially set to the value 1.

    Create Instances of Module viewModels

    Next we create the array pageArray, which will hold a concrete instance of a viewModel for each of the modules that can be loaded into the wizard. Notice how a reference to the main Composite Component viewModel is passed as a this reference into the constructor for each viewModel. This will allow communication between the page module and the component as a whole.

    Note that you could of course make this instantiation lazy should you want to, I've just gone for simple and clear in this example.

    Definition of wizardModuleSettings

    The wizardModuleSettings object is the object referenced from the data-bind statement for the module. Its job is to configure the ojModule using the available options. In this case we are specifying two key bits of information:

    1. A viewModelFactory to tell ojModule how to obtain an instance of the viewModel for the module
    2. A createViewFunction to tell ojModule the name of a function to call on the supplied module viewModel to obtain the matching view for the module

    You could define other settings here as well (except name and viewName, which we have removed the need for by sourcing the information via the factory). Notice how the wizardModuleSettings variable is defined as a Knockout observable. This is important because we want a way to automatically refresh the ojModule binding to switch pages as we step through the wizard - we'll see how that is done in a moment.

    Definition of the Factory

    The viewModelFactory property of the wizardModuleSettings points to the modelFactory object defined in this class. The expectation is that whatever object is pointed to by viewModelFactory should provide a method called createViewModel. This object and method will be called to create or obtain the viewModel to use for the module. We already have an array of possible viewModels ready to go in the pageArray, so all my factory createViewModel method has to do is select the correct one based on the value of currentPage and return it.

    The framework expects the return value from the createViewModel to be a promise, so we just wrap the correct viewModel instance for the correct page up in one using Promise.resolve(...). The fact that a promise is expected here is actually useful as it means that you could, as a variation to the pattern, dynamically load a viewModel using require() and still have it all work.
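    Stripped of the JET specifics, the factory contract can be sketched in plain JavaScript. The page objects below are simple stand-ins for the real module viewModels, and lazyFactory illustrates the asynchronous variation mentioned above:

```javascript
// Minimal stand-ins for the per-page viewModels held in pageArray.
const pageArray = [{ page: 1 }, { page: 2 }, { page: 3 }];
let currentPage = 1;

// The shape ojModule expects: an object exposing createViewModel()
// that returns a promise resolving to the viewModel instance.
const modelFactory = {
  createViewModel: function () {
    // The instance already exists, so wrap it in an immediately
    // resolved promise to satisfy the contract.
    return Promise.resolve(pageArray[currentPage - 1]);
  }
};

// A lazy variation: resolve the model asynchronously instead
// (in a real component this could be a require() call).
const lazyFactory = {
  createViewModel: function () {
    return new Promise(function (resolve) {
      setTimeout(function () { resolve(pageArray[currentPage - 1]); }, 0);
    });
  }
};

modelFactory.createViewModel().then(function (vm) {
  console.log(vm.page); // 1
});
```

    Because both factories present the same promise-based interface, the consumer of the factory does not need to know whether the viewModel was pre-instantiated or loaded on demand.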

    And the Rest

    There is, in fact, not much more of interest in the Composite Component viewModel. The remaining functions shown are all concerned with either raising the supported events or managing the navigation through the pages.

    The one remaining bit to discuss is the _changePage() function, which is called when moving forward or backward through the set of pages. Note how this calls valueHasMutated() on the wizardModuleSettings observable. Although the actual values inside of that object have not changed at all, by telling Knockout that it has changed, the data-bind for the ojModule will be re-evaluated and the correct module loaded in the process.
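    To see why this works, here is a minimal simulation of the observable behaviour. This is a sketch of the idea only, not Knockout's actual implementation: the point is that notification is decoupled from whether the stored value actually changed.

```javascript
// A tiny stand-in for a Knockout observable, just enough to show
// why valueHasMutated() matters.
function miniObservable(initial) {
  let value = initial;
  const subscribers = [];
  function obs(v) {
    if (arguments.length === 0) return value; // read
    value = v;                                // write...
    obs.valueHasMutated();                    // ...notifies too
  }
  obs.subscribe = function (fn) { subscribers.push(fn); };
  obs.valueHasMutated = function () {
    // Notify regardless of whether the value changed.
    subscribers.forEach(function (fn) { fn(value); });
  };
  return obs;
}

// The binding re-evaluates whenever it is notified, even though the
// settings object itself is unchanged - this is what reloads the module.
const settings = miniObservable({ createViewFunction: 'resolveModuleView' });
let bindingEvaluations = 0;
settings.subscribe(function () { bindingEvaluations++; });

settings.valueHasMutated(); // same object, but subscribers still fire
settings.valueHasMutated();
console.log(bindingEvaluations); // 2
```

    In the wizard, each call to _changePage() triggers exactly this kind of notification, so the ojModule binding re-runs its factory and picks up the viewModel for the new currentPage.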

    Defining the moduleViewModels

    So, we have injected constructors for each of the wizard pages into the Composite Component viewModel. Each of these viewModels will of course have their own settings internally depending on the data that they need to deal with. However, they will all have to share the same essential backbone. I'll use the Page1Model to illustrate this:

        define(
          ['ojs/ojcore','knockout','jquery','text!../views/ccdemo-wizard-page-1.html'
          ], function (oj, ko, $, moduleView) {
            'use strict';
            function CCDemoWizardPage1ViewModel(componentVMRef) {
              var self = this;
              self.parentComponentVM = componentVMRef;
            };
    
    
            CCDemoWizardPage1ViewModel.prototype.resolveModuleView = function() {
                return moduleView;
            };
    
            return CCDemoWizardPage1ViewModel;
        });
      

    Injection of the Module View

    In the define block for this class, I inject the relative location of the matching HTML template for the module. In this case, the location will be relative to this viewModel script. This view is loaded using the RequireJS text plugin and stored in the moduleView parameter.

    Constructor Function Definition

    Recall from the main Composite Component viewModel listing that we instantiate instances of each module viewModel, passing in a back-reference to the Composite Component viewModel in the process. This is so that the wizard page can write state back to the main component if needed. Accordingly, we need to add a parameter to its viewModel constructor, CCDemoWizardPage1ViewModel(componentVMRef). This is then stored in parentComponentVM for later use.

    Definition of the resolveModuleView Function

    The configuration that we passed to ojModule specified a value for createViewFunction, which I hardcoded to the string 'resolveModuleView'. This means that the framework will try to call a function with this name on the supplied viewModel, so we just need to implement that. Fortunately, this function can be really simple because RequireJS has already done the hard work of loading the view HTML for us. We just need to return the moduleView parameter populated by RequireJS.
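    Conceptually, the contract between ojModule and the viewModel looks like the sketch below. The loadView helper is a stand-in for the framework's internal lookup, and the markup string stands in for HTML that the text! plugin would have loaded:

```javascript
// The view markup as the text! plugin would deliver it.
const moduleView = '<div data-bind="text: title"></div>';

// The module viewModel exposes the function named by createViewFunction.
function CCDemoWizardPage1ViewModel() {}
CCDemoWizardPage1ViewModel.prototype.resolveModuleView = function () {
  return moduleView; // simply hand back the preloaded markup
};

// What the framework does, conceptually: look the function up by its
// configured name on the resolved viewModel and call it for the view.
function loadView(viewModel, createViewFunction) {
  return viewModel[createViewFunction]();
}

const vm = new CCDemoWizardPage1ViewModel();
console.log(loadView(vm, 'resolveModuleView'));
```

    The string-based lookup is why the createViewFunction setting and the prototype method name must match exactly.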

    That's It

    So you now have the core pattern to follow if you need to create these sorts of multi-module Composite Components. There are, of course, many possible variations in the way that this pattern can be used or adapted; however, the basic use of the ojModule viewModelFactory is going to be a key part of any such strategy.


    CCA Series Index

    1. Introduction
    2. Your First Composite Component - A Tutorial
    3. Composite Conventions and Standards
    4. Attributes, Properties and Data
    5. Events
    6. Methods
    7. The Lifecycle
    8. Slotting Part 1
    9. Slotting Part 2
    10. Custom Property Parsing
    11. Metadata Extensibility
    12. Advanced Loader Scripts
    13. Deferred UI Loading
    14. Using ojModule in CCAs

    OGG Custom Adapters: How to include a unique identifier for every record in custom adapter?

    1. Add redo attributes to be captured by the primary Oracle extract process, so that additional tokens will be written in the trail for each captured record.
      a) Add the user tokens. Use @GETENV() in the extract parameter file to retrieve the desired attributes, then inject the combination of the redo seqno and rba of the actual source record into the trail file. The redo seqno and rba will be unique for each single source record.
      Note: In the case of a RAC source system, one additional attribute, the redo thread id, can also be added to achieve uniqueness across all RAC instances.
      b) Add these in the mapping specification of each captured table, specifically in the TOKENS clause.
      Refer to the documentation: https://docs.oracle.com/goldengate/1212/gg-winux/GWUAD/wu_datainteg.htm#GWUAD468; section 12.13.1, Defining Tokens, should help.
      For example: TABLE src.table, TOKENS ( redoseq = @GETENV('RECORD', 'FILESEQNO'), redorba = @GETENV('RECORD', 'FILERBA'), redothread = @GETENV('TRANSACTION', 'REDOTHREAD') );
    2. The trail for that table should now contain those extra attributes for each source DML record, and the Big Data / Application Adapter component can retrieve them via the op.getToken(userTokenName) API invocation; the custom implementation can then do whatever it wants with them.
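    Once retrieved on the adapter side, the tokens can be combined into a single identifier. The sketch below is illustrative only: the token names match the TOKENS clause above, and the values are assumed to have already been read (via op.getToken() in a real adapter):

```javascript
// Assemble a unique record identifier from the redo attributes
// captured as user tokens: thread id (RAC only), seqno, and rba.
function buildRecordId(tokens) {
  const parts = [tokens.redothread, tokens.redoseq, tokens.redorba];
  // Drop the thread component for non-RAC sources where it is absent.
  return parts.filter(function (p) { return p !== undefined; }).join('-');
}

console.log(buildRecordId({ redoseq: '1042', redorba: '55831', redothread: '2' })); // "2-1042-55831"
console.log(buildRecordId({ redoseq: '1042', redorba: '55831' }));                  // "1042-55831"
```

    Because seqno and rba uniquely locate a record in the redo stream, the composite string is unique per source record, and adding the thread id keeps it unique across RAC instances.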

    Where do you specify the Date Format Mask

    When reviewing Oracle APEX applications I often see hardcoded date or timestamp formats. You can define your date formats in multiple places in your application: in your item or column attributes, as part of your code, e.g. TO_CHAR(sysdate, 'DD-MON-YYYY HH24:MI'), or if you want to make it more reusable you might create a... [Read More]

    Monday Spotlight: It's About Security - Oracle Exadata SL6


    Well, not really.  If you read my co-worker Gurmeet's Blog, you’ll see the Exadata SL6 is one incredible machine.  However, I want to talk about a very important aspect of the Exadata SL6 that I don’t think is getting enough play: its security features.

    We’ve been fighting the security battle for years now and it has become a booming business estimated at $445 billion in 2016.  You could even say that hackers are the new mafia.  But that’s just the business side of cyberattacks.  There’s also state-sponsored cyberattacks.  It’s really cyberwarfare and it’s playing out every day around the globe with every credit card transaction, every mobile phone call and every social media interaction.  Somewhere, someone is being cyberattacked while you read this.  It’s estimated that it costs the healthcare industry $200,000[1][2] every minute of every day worldwide.

    Now, I’m not trying to be alarmist.  But we need to talk about this. For decades, we’ve worked hard to protect our data centers by attempting to keep people out.  And that worked for a while.

    It’s much like the castles and keeps of the middle ages. Build a big, strong wall and keep the bad guys out.  However, much like those castles and keeps, building a strong wall around the data center has failed.  The castles had large doors or gates which had their own vulnerabilities.  So, they built moats. But even then, there were bridges so that the people could get in and out.  For today’s data center, we have layers of firewalls and Web servers.  And just like those bridges and gates, today’s Web servers are the gateways to commerce.

    Just like the keeps and castles of yesteryear, the strategy of “build a strong wall” has failed. Back then, spies, disguised infiltrators and even “backdoors” did the castles in.  Today, we have IoT and laptops and software bugs.  You can’t protect the perimeter enough when the very devices your people are using are the infiltration mechanisms. 

    So, we have to protect the entire data center.  There are three areas that need to be addressed or “Pillars of Protection”.  These are people, platform and data. 

    People are the most obvious risk and also might be the hardest to protect against.  Overly simple passwords and social engineering attacks, as well as spam, make it all too easy to get access to user accounts.  There are mechanisms to protect against this, but I’m going to leave that for another day. 

    Protecting the platform is critical. Software security vulnerabilities (CVEs) are going to be there.  We’ve seen many of them recently, Dirty COW being a particularly bad one.  So, you are now constantly patching CVEs in your data center.  But there are a few problems with that.

    1.)   You aren’t patching all your servers.  I know this because the vulnerabilities being exploited are more than a year old every single year.[3]

    2.)   When you do patch, it takes more than 3 months to do it. This is what our customers have told us.

    3.)   Once you’ve patched, you’ve finally closed the door on a vulnerability that has likely been there and exploitable for years. Heartbleed was there for 10 years before it was discovered.

    Chasing CVEs and patching them is a no-win scenario.  We need to think about mitigating whole classes of vulnerabilities so we can stay secure while we fix the root cause of each vulnerability.  As it turns out, just 4 types of vulnerabilities make up about two-thirds of all vulnerabilities[4].  Two of these, code execution and overflow, can be stopped by what we call Silicon Secured Memory (SSM).

    Silicon Secured Memory is part of the Security in Silicon capability of the SPARC processor built into the Oracle Exadata SL6.  SSM colors memory as it is allocated and then verifies that every access presents the same color that the memory currently carries.  This means that a buffer overread or overwrite attack like Heartbleed can’t happen.  Certain types of code execution attacks can also be prevented with SSM, as writing to memory without the correct color will not be allowed.
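    The coloring idea itself can be illustrated with a small, purely conceptual simulation. Real SSM enforces this in hardware via version bits carried on pointers and memory; nothing below reflects the actual mechanism, only the access-check principle:

```javascript
// Conceptual simulation of memory coloring: each allocation is tagged
// with a color, and every access must present the matching color.
function allocate(size, color) {
  return { data: new Array(size).fill(0), color: color };
}

function read(block, index, color) {
  if (color !== block.color) {
    // The mismatch is what stops a Heartbleed-style overread:
    // the attacker's pointer carries the wrong color.
    throw new Error('color mismatch: access rejected');
  }
  return block.data[index];
}

const buf = allocate(16, 'red');
console.log(read(buf, 0, 'red'));  // 0 - legitimate access succeeds

try {
  read(buf, 0, 'blue');            // stale or forged pointer color
} catch (e) {
  console.log(e.message);          // "color mismatch: access rejected"
}
```

    The value of doing this in silicon rather than software is that the check happens on every access at essentially no cost, which is why whole classes of overflow bugs are mitigated even before they are patched.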

    Now, no hardware feature can be of use unless the software running on that system utilizes it.  That’s why we’ve built the Oracle Database to take advantage of SSM on the Exadata SL6.

    So, Exadata SL6 excels at protecting itself and the software stack from overflow and execution attacks. And it does this with minimal performance overhead. 

    The third pillar of protection is the data. You have to protect it.  It’s what the cybercriminals and state-sponsored bad actors are after.  However, encrypting data is expensive.  It’s expensive in that it consumes a large number of processor cycles to encrypt and decrypt.  This means you have to choose between performance and security, which has been a long-time struggle.  You had to decide what data absolutely needed to be encrypted and weigh the impact of that on your business and expenses, as the performance penalty meant buying more or bigger systems.

    With the Exadata SL6, you no longer have to choose between performance and protection.  The SPARC M7 processors in the Exadata SL6 each have 32 decryption engines that can decrypt at the speed of memory. This allows you to simply encrypt all your data, and do it without the performance penalty.  Combining the M7 crypto engines with Oracle Database Transparent Data Encryption means that protecting your database and deciding which data in the database to encrypt has never been easier. Just encrypt all of it.

    The Exadata SL6 database servers run the Oracle Linux operating system making them simple to deploy in environments that are standardized on Linux. 

    The Exadata SL6 brings more than just 2x performance at the same price.  It brings new security capabilities; securing your data easily.
