
Wednesday, April 28, 2010

Oracle BPM 11G, less is more


Oracle has just launched the 'new' Business Process Management Suite (BPM Suite). Special as it is, it has been announced as just a patch set of the Fusion Middleware stack, and there is a reason for that. Oracle has been working for years to realize its vision of a product stack that is Complete, Open and Integrated: opportunity driven, but with this clear vision in mind, and aimed at supporting Oracle's applications business. This resulted in a continuous stream of software companies being acquired and components being added to complete the stack. To avoid a mess, Oracle continuously evaluates which product parts are strategic and advocates reuse to the max. Reuse of components across the Oracle 11g Fusion Middleware stack, such as the Database Adapter, is essential: it improves the stability and predictability of the solution. BPM is just one of the components plugging into the stack, and it reuses all the other components. Compared to one of its predecessors, BEA BPM, the product is stripped down to its core functionality: less is more!
What is the 'new' BPM stack? Its core is based upon the BPMN 2.0 specification. The goal is to give one integrated view, from architecture all the way to implementation. For every role in the process life cycle the BPM stack contains a component, and the components are more or less connected: they can all work in unison, but they can also act separately.

In this blog we'll detail the first two phases, Design and Build. In later blogs we'll elaborate on the other process life cycle phases.

Oracle offers three tools for design.

Design at the enterprise level is done with the IDS Scheer ARIS Design Platform. There is currently no connection between ARIS and the rest of the (new) BPM stack, but Oracle can be expected to provide one in the (near) future.

High-level process flow design can be done within the browser-based Process Composer, a tool aimed at business analysts. The Process Composer adds the ability to discuss the shape and structure of a process with business users in a friendly way, with a rich set of functionality. This has huge potential for use in rapid design visualization sessions.

The design can also be done within JDeveloper BPM Studio, which is more suited to developers. Typically, design in JDeveloper BPM Studio is done when a solid functional design already exists. Processes and process templates can be shared between Process Composer and JDeveloper BPM Studio.

BPM process development in JDeveloper BPM Studio takes place as part of an SCA composite.
Every component within the SCA composite can be developed in isolation, which enables delivery of components by different development teams. This is a big advantage over the 'old' BEA BPM tool, where everything was contained in one project. JDeveloper BPM Studio is very GUI oriented: you drag and drop components onto the process canvas. Swim lanes make it possible to divide the Human Tasks in a process across the different user roles. Simulation can help give insight into the behavior of a process, though simulation is an art of its own.
In the next blog we will elaborate on the Build and Deploy phase.

Oracle BPM group Capgemini The Netherlands, Léon Smiers, Alexander Bijl, Gert Jan Kersten
(a repost from Capping IT Off)

Monday, August 24, 2009

Composing applications like Lego, SCA in a nutshell


Composition is all about combining functionality exposed from different sources. Up until now, SOA hasn't brought us the salvation of combining functionality from disparate sources, but there may be help on the way: Service Component Architecture, or SCA, offers a simple application model for wiring all this functionality together.

Composition is about combining functionality exposed within an application or from external sources. How is this implemented in programming languages?
- Combining functionality within an application is standard practice in languages such as Java and .Net. The functionality exposed and reused is fine grained, there are lots of calls between objects, and the communication is based upon a method type of call.
- Remote invocation in the same language, from outside the application, is hard to implement; it is error prone, and a lot of time is usually lost on connection and protocol complexity.
- Crossing the border in a multiple-language environment calls for the SOA neutral-language approach, typically done with Web Services. The usage model of a SOA type of call is completely different from combining functionality within an application: the exposed and reused functionality is coarse grained, the number of calls between services is mostly low, and communication is based upon a document type of call. Implementing Web Service invocation and handling, even though standard frameworks exist, is still time consuming, and all the error handling needs to take place within the application (see the sketch below).
The complexity grows when the same service needs to be exposed over multiple communication protocols. A unified communication model based upon Web Services is unfortunately not the answer, due to the different usage models of SOA calls and low-level application calls. Web Services used for low-level application calls, for instance, kill performance.
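To make that plumbing tangible, here is a minimal sketch of a dynamic JAX-WS call-out in Java; the endpoint, service and port names and the payload are made up for the example:

import java.io.StringReader;
import java.net.URL;
import javax.xml.namespace.QName;
import javax.xml.transform.Source;
import javax.xml.transform.stream.StreamSource;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Service;

public class WebServiceCallout {

    public static void main(String[] args) throws Exception {
        // Everything below is plumbing; none of it is business logic.
        URL wsdl = new URL("http://example.com/orders?wsdl"); // hypothetical endpoint
        QName serviceName = new QName("http://example.com/orders", "OrderService");
        QName portName = new QName("http://example.com/orders", "OrderPort");

        Service service = Service.create(wsdl, serviceName);
        Dispatch<Source> dispatch =
                service.createDispatch(portName, Source.class, Service.Mode.PAYLOAD);

        Source request = new StreamSource(new StringReader(
                "<getOrder xmlns=\"http://example.com/orders\"><id>42</id></getOrder>"));

        // Connection problems, SOAP faults and marshalling errors all surface
        // here, and the application has to handle each of them itself.
        Source response = dispatch.invoke(request);
    }
}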

Service Component Architecture (SCA) offers an implementation model for combining functionality, from internal and external sources, into composites. It offers an environment that takes care of all the communication, or wiring, between the services. SCA is limited to combining application or data logic; it is not aimed at the presentation layer or the data persistence layer. SCA is a standard defined by the OSOA organization and is backed by the vendors IBM, SAP, Oracle (including BEA), Iona and Sun.

Without going into detail, here is a short summary of SCA.
SCA provides four basic building blocks: Services, Components, Composites and the Domain.
Services, Components and Composites are part of the SCA design time environment; the Domain is the SCA run time environment.
The basic building block is the Component, which holds and implements functionality that can be exposed via Services. SCA does not deal with the programming of the component itself; it only handles the combining of functionality. The only thing a developer needs to do is state in the component (for instance a Java class) that at a certain point in the application external functionality is needed; this is done via a Reference name added as an annotation in the application (see the Java sketch below).
The Composite is a virtual container (just an XML file) where Components are defined and wired together. Policies can be added to Components when specific requirements on usage or communication apply, for instance security or reliability.
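As a sketch of what this looks like on the Java side, assuming the OSOA annotation API (org.osoa.sca.annotations) as implemented by, for instance, Apache Tuscany; the service and component names are invented for the example:

import org.osoa.sca.annotations.Reference;
import org.osoa.sca.annotations.Service;

// The business interface this component exposes as an SCA Service
interface OrderService {
    String placeOrder(String item);
}

// External functionality the component needs; the composite decides what
// this is wired to: another component, a Web Service, and so on.
interface CreditCheckService {
    boolean approve(String item);
}

@Service(OrderService.class)
public class OrderComponent implements OrderService {

    // Declared as a Reference: the SCA runtime injects the wired target,
    // and the component code never touches connection or protocol details.
    @Reference
    protected CreditCheckService creditCheck;

    public String placeOrder(String item) {
        return creditCheck.approve(item) ? "accepted" : "rejected";
    }
}

The composite XML then decides what the creditCheck reference is wired to, and over which binding; the component itself stays plain Java.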
SCA makes building applications as easy as combining Lego blocks: components can be combined into a composite, and composites can be reused in new composites. But be aware that the chain is as strong as its weakest link. If anywhere in the composite there is a component that is badly tuned, it affects the composite as a whole. Composites should therefore be accompanied by SLAs stating what is expected of them.
The run time environment of SCA is the Domain. Multiple composites can be contained in a Domain, which takes care of all the communication handling and the policies. In contrast to the SOA principle of being vendor neutral, a Domain is based upon one language. This reflects one of the lessons learned from SOA: vendor neutrality for fine-grained services is overkill from a design perspective, and at run time it can be so resource intensive that it causes performance problems.
SCA takes a very pragmatic approach: when services and clients are written in the same language, there is no need for a language-neutral representation. SCA handles all the communication between the objects, or wiring, in the way best suited to the language and the environment. In a hybrid, distributed environment the communication is handled via Web Services.

What are typical use cases for SCA?
(1) Data composites: in an SCA data composite, data coming from different locations is combined, and the composite returns the combined set.
(2) Application functionality Composite
(3) Process composites

SCA will be able to fulfill the promise of SOA as long as its basic principle, simplicity, is preserved.

Sunday, July 12, 2009

The risks of the SOA EDA promise, a collapse of the power grid of applications


Looking at history, where networks are created and can collapse, we in IT need to learn what the risks are of our attempts to unite the IT landscape with Service Orientation (SOA) and Event Driven Architecture (EDA). Examples from the financial world and the utility market should be a warning to IT of what could hit us. This calls for craftsmanship from the people in our profession: the ability to look across borders, and control by means of governance, at design time AND at run time.

One example of a collapsing network is the ongoing subprime mortgage crisis, which is still creating waves of unemployment, closing companies and costing a lot of money. It was caused by the globalization of the financial world, the belief that house prices would never go down, the growing interdependence of markets and a lack of control. At one point it became clear that the foundation of the economic rise, the ever increasing house price, was no longer a certainty, and the network collapsed.

Another example is in the utility market. On August 14th, 2003, a disruption of the energy grid caused a blackout in the north-eastern part of the US and Canada, hitting some 50 million people. It started with a local fault in Ohio and, in no time, cascaded across the whole region. Here one flaw in one part of the energy grid caused the collapse of a very large part of it.

In IT we come from silo-based architectures: every functionality (or set of functionalities) is contained in one silo, which may have integration points with other parts of the enterprise but essentially stands on its own. When one of these silos goes down it can cause a major problem for the enterprise: loss of sales, the failure of an essential production process, and in the end money and reputation damage. But it would NOT cause a cascading domino effect on other systems or enterprises. The promise of SOA and EDA is a network of applications where everything can be connected and reused. It is especially the reuse part that carries the risk: reuse without insight into the relations can create single points of failure.

We are creating the power grid of applications for the future; that makes us responsible for making the right choices now.
One of these choices is to know the dependencies between the system components, both at design time and at run time.
At design time, we need to be able to quickly identify all the dependencies in our landscape. This includes all application types: custom built (on all kinds of technologies), package based, legacy and integration components. The ultimate wish is to be able to create an impact analysis of all the touched components with one push of the button. Another decision made at design time is whether we would like to reuse an existing component, elsewhere in the enterprise landscape or even outside it, and what the impact is if that (reused) component goes down. Based on that risk analysis we can decide whether to reuse the component or build it ourselves.
At run time we need insight into the interrelations between all deployed components, so that we can identify possible risk areas, for instance a service that is used by an ever increasing number of applications.

Where should we start? In most environments people start off with Excel sheets, which in no time become unusable because of the complexity; this calls for proper governance tooling (or, at its most basic, a dependency tracking tool). When you start with a governance tool, the environment is, of course, empty. Loading all the components and dependencies of your environment into it is time consuming and costly. This is the main reason why dependency tracking tools, (SOA) governance tools and service repositories are hard to get started in enterprises: the business case at the start is way too costly, even though we all know that in the long term it is essential for getting a grip on your IT environment. A differentiator in this market will be the option to load an existing environment into the dependency tracking tool's repository.

Friday, June 12, 2009

The last piece of the process integration puzzle, looking at the UI part


Process integration is a subject that always comes back when you're defining a software architecture. There are different types of process integration, based upon the integration partners involved, the integration type, the number of users involved, the connection type, the length of the process, etc. Loads of books have been written about process integration, and a lot of patterns have been defined.
For the Application-to-Application (A2A) and Human-to-Application (H2A) types of processes, cross-technology solutions have been defined in an XML based language, BPEL. BPEL is the conductor of a process and does call-outs to functionality by means of Web Services and adapters to legacy systems.
The functionality that BPEL offers for process integration, however, is not very useful for a UI type of process, like the close-out sequence of a book ordering process on the Web. These are typically hard-wired within UI frameworks, like we do (or used to do) in Java with Struts or JSF. In these solutions there is no flexibility or reuse: rearranging a UI process requires rework and redeployment, and on average there is not much reuse of UI process components. From a SOA perspective, you would like to reuse (groups of) pages and handle them as services in order to create composite applications.
There is some movement in this market: in the Fusion Middleware 11g release, Oracle came up with a UI taskflow mechanism.
The basis for the Oracle solution is Java Server Faces (JSF), and it is called ADF Taskflows. The components in these taskflows are complete pages (JSF), page fragments (parts of a JSF page), Java call-outs and, most importantly, the taskflow definition. The taskflow definition resembles the BPEL way of working and is the conductor used for handling a process from a UI perspective. It uses a plain XML file referencing all the needed components, or services.
Two types of taskflow exist: the unbounded and the bounded taskflow.
An unbounded taskflow is just a predefined set of pages used to complete a task; it can be entered from any page within the taskflow.
A bounded taskflow acts more like a service: only one entry point exists, plus zero or more exit points, which can produce an output result. The inner behavior is 'private' and cannot be changed or interfered with by the outside world.
Bounded taskflows are the neatest ones; I have been searching for these for quite some time. Finally, the possibility to reuse a group of pages within a new composite.
I hope the market picks up this initiative and creates a cross-technology integration language for UI based process integration, just like we now have with BPEL.
For more information see:

  • Oracle Application Development Framework ADF
  • Fusion Developer's Guide for Oracle, Part III Creating ADF Task Flows
  • Nice example pages
Tuesday, March 24, 2009

Oracle Coherence experiences from the field


Last week I had a short conversation about Oracle Coherence on Twitter (#OraCohMDM). Here is a more in-depth description of our experience with this product. Last year we came across a couple of projects that made an excellent case for using a grid database on the middle tier.

The first case was about processing bulk data in a small timeframe. Every minute, some 150MB of XML data was delivered (zipped) to our application via a Web Service. The requirement was that the data needed to be processed within 20 seconds, and that selected parts of the data needed to be sent to a set of subscribers. The architecture had to be scalable in both the number of consumers and the data load. To keep a history, all the data also needed to be stored in a database.

The second case could be described as having high CIA constraints (Confidentiality, Integrity and Availability). Multiple providers sent data via Web Services; contrary to the first case, the messages were small in size. The number of messages, however, was much higher: some 10-100 messages per second.

Since we know that (Oracle) databases are stable, why can't we implement this with the help of a database? The answer is simple: there is a performance bottleneck between the middle tier and the database. In a 'normal' situation, for instance a web based transaction application, the JDBC connection is able to do its work properly, but when faced with a high data volume or a high message volume, JDBC clogs up. So why does a grid database do a better job? Elements of success for a grid database are the ability to scale up and down easily, and to minimize the risk of losing data when, for instance, a server goes down. Preferably there should be no master of the grid: all grid elements are equal. Simply put, the grid works as follows. A grid is a series of interconnected nodes, which communicate via an optimized network protocol. Every grid node has knowledge of the grid, of what kind of work to do and of the existence of its neighbours, just like bees in a beehive. When a new data object enters the grid, one grid node takes responsibility for it and contacts another grid node to be its backup (preferably a node on another server). When a new node enters the grid, it communicates some sort of 'Hello, I'm here', just like when you enter a party: all grid nodes start communicating with the new node, and (some) data is redistributed across the grid. When a node leaves the grid, again (some) data is redistributed, to avoid a data object being available on only one node, and so to minimize the loss of data.
Oracle acquired Coherence (Tangosol) in 2007 because of its excellent middle tier grid capabilities. We have performed tests with Oracle Coherence in different hardware and network configurations. Our tests have shown that Coherence scales nearly linearly, with an optimum of around one node (or JVM) per CPU core. Running on the JRockit JVM improved performance even further.

So how do you start with Oracle Coherence? Coherence is Java based, so a good knowledge of Java is essential. The data is stored as key/value pairs in maps (comparable to database tables). On each map a listener can be placed, so that an object change results in an event. By adding object-relational mapping (for instance with TopLink) you can percolate newly added data through to the database, or load data from the database. And by adding an expiration time to the data within Coherence, you don't have to clean up your data yourself.
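A minimal sketch of this way of working, assuming the classic Coherence API (com.tangosol.net); the cache name, key and expiry are made up for the example:

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
import com.tangosol.util.MapEvent;
import com.tangosol.util.MapListener;

public class CrateCacheDemo {
    public static void main(String[] args) {
        // Joins (or starts) the cluster and returns the named key/value map
        NamedCache crates = CacheFactory.getCache("crates");

        // Every object change on the map results in an event
        crates.addMapListener(new MapListener() {
            public void entryInserted(MapEvent ev) { System.out.println("inserted: " + ev.getNewValue()); }
            public void entryUpdated(MapEvent ev)  { System.out.println("updated: "  + ev.getNewValue()); }
            public void entryDeleted(MapEvent ev)  { System.out.println("deleted: "  + ev.getOldValue()); }
        });

        // A put with an expiration time: the entry cleans itself up after one minute
        crates.put("crate-42", "entered warehouse A", 60000L);

        CacheFactory.shutdown();
    }
}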

Where to get more information:

- SOA Patterns, deferred service state
- Oracle Coherence
- Oracle Coherence Incubator


Thanks to Oracle PTS (especially Flavius Sana & Kevin Li)

Tuesday, March 10, 2009

Cross domain Business Rules, who has got ideas?


We are moving away from a single application landscape towards a more distributed, service oriented landscape. This has impact on the way we have to deal with data integrity. In the 'old days' we were confined to one dataset, in which we could check our data integrity by means of simple programming with triggers and constraints, or by adding a Business Rule Engine bound to the data model. The only discussion we had was about the place where the rules were checked (in the screen or in the database).
I am currently doing (nice) work for a company that is moving away from its 'single application' back-offices (one being USoft, from the Dutch company that used to be known as the Business Rule company) towards a service oriented environment. The landscape is going to be separated into domains, based upon ownership of the different functionalities.
Business Rule thinking is part of my client's DNA, so we really have to define a thorough solution for validating rules across domains.
The rules of engagement are:
- Domains are independent
- Transactions executed in one system may not be hindered by restrictions defined in another system
- Business rules are defined and owned by the domains. These rules can be used across domains, but are always started by the domain that owns the rule.

How do we deal with cross domain business rules?
Consider, for instance, two domains, Relation and Education. Within the Education domain we have a business rule: 'An active education can only be connected to an active relation.' It is the responsibility of the Education domain to uphold this rule. When a relation is de-activated within the Relation domain, this is allowed, since the Relation domain is not responsible for the Education domain. So at this point in time we have an invalid situation.
How to deal with this?
We could agree that the Relation domain sends information about the de-activation to a central messaging mechanism, where other domains (i.e. the Education domain) can subscribe to this type of message; a sketch follows below. The Education domain is then responsible for taking action to restore a valid situation, which could be the closing of the education belonging to the relation. Part of the governance of the Education domain is then to periodically check whether open educations exist that belong to closed relations, to make sure that no invalid situations remain when the subscription mechanism did not work correctly.
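As an illustration, a minimal sketch of what the Education domain's subscription could look like with plain JMS; the JNDI names, topic, message selector and message property are all hypothetical:

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.Session;
import javax.jms.Topic;
import javax.naming.InitialContext;

public class RelationDeactivationSubscriber {

    public static void main(String[] args) throws Exception {
        // JNDI names depend on the local messaging setup
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Topic topic = (Topic) ctx.lookup("jms/RelationEvents");

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // Message selector: only react to de-activations from the Relation domain
        MessageConsumer consumer =
                session.createConsumer(topic, "eventType = 'RELATION_DEACTIVATED'");
        consumer.setMessageListener(new MessageListener() {
            public void onMessage(Message msg) {
                try {
                    String relationId = msg.getStringProperty("relationId"); // hypothetical property
                    // The Education domain restores a valid situation itself
                    closeOpenEducationsFor(relationId);
                } catch (Exception e) {
                    // Anything missed here is caught by the periodic governance check
                    e.printStackTrace();
                }
            }
        });
        connection.start();
    }

    static void closeOpenEducationsFor(String relationId) {
        // Domain logic: close the open educations belonging to this relation
    }
}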
The ease of checking our business rules and upholding data integrity in a single application environment is gone in the distributed, service oriented world. Who has experiences or ideas on how to deal with the cross domain validation problem?

Tuesday, February 17, 2009

The Data Refinery


Two years ago I was involved in an RFID project in the logistics market, in a joint Capgemini and Oracle effort. Based upon the EPCIS architecture we had to combine movement data of crates between different storage facilities and stores. Without going into the details of the case, on the technology side the biggest challenge in RFID is the huge amount of data, and how to get data out of it that makes sense for a particular use case. In discussion with Ard Jan Vethman, we came up with a division of the life cycle of RFID data into three main segments:
(1) In real time (within seconds), you want direct action: a crate is entering a room where it is not supposed or expected to be.
(2) In the short term (hours, days, weeks, depending on the business case), you want to act upon a condition over multiple measurements: for instance, when a shipment of fresh food between a factory and a warehouse normally takes 4 hours, you want an alert when the shipment takes a day.
(3) In the longer term (more than weeks), you want to see how your processes can be optimized against the measurements that come in.
You can look at this as a data refinery: bulk data is processed as it flows in (complex event processing), and the result, or residue, is a high quality, small subset of all the data that entered: the exceptions to all that is OK. The residue is then used as events which can be sent to a business administrator or a systems administrator. Oracle provides technologies for each segment. For the longer term we have known BI for a long time; for the short term, Oracle Business Activity Monitoring (BAM) has been around for some years now and provides excellent insight, both graphically and through alerts. For the real time segment we used to have to program the event processing ourselves, but recently Oracle released Complex Event Processing (CEP) as a COTS product, which enables querying data on a data stream. Getting insight into a near real time data stream and doing some cherry picking there is still a new and underdeveloped, but exciting, area.

Monday, February 9, 2009

Aim for Stability Based Architecture (SBA)


Software architecture is the art of translating business needs into a suitable, working technical solution.
Just as in architecting and building houses, our sponsor expects that the product will be stable and will last for quite some time. Looking back, we see that in the Cobol era applications were able to last more than 20 years (some even exist now, after the retirement of their creators). But with the speed of new developments and the advent of hype driven development, the application lifespan (or the time before it gets outdated) is getting shorter and shorter. Java, among others, has a bad reputation in this area: Struts, for instance, lasted some 2 to 3 years, after which you were the laughing stock of the year when you decided to use Struts in your solution. SOA is another example: just when working with it finally paid off, somebody decided that SOA is dead and we have to run after the latest hype, cloud computing. The half-life of a software architecture is getting shorter and shorter, and it happens too often that at go-live we almost have to think about upgrading, because the version is nearly outdated.
I'd like to propose a new architecture type, Stability Based Architecture (SBA), in which we aim for a stable (core of the) application and add an expected lifetime for the application. The core of the application should be stable and last a long time, just like rock under pressure: it folds but doesn't break. Only then can we justify the costs of implementation.

Monday, January 26, 2009

The Return of Oracle Forms



Oracle Forms has been around for some 20 years now. It looked like it would go extinct over time, but the financial crisis might give it a second chance.
Oracle Forms is a GUI typically used in transaction based applications for high volume data entry. Forms started off as a character based application in the eighties, evolved into a GUI client in the early nineties, and can now be deployed as an applet based application running in a browser. The basis of Oracle Forms hasn't changed much: a dedicated Forms process is connected to a database. In the early days this Forms process was tightly connected to the GUI; in the current implementation the Forms process is located on the application server, and the GUI is located in a browser on the client. The big difference with a 'normal' web application is that the connection between the Forms client and the application server is stateful, whereas with a 'normal' web application it is stateless. This articulates the type of usage: for every user a separate Forms process is started on the server, which consumes processing power and memory. In short, Forms is typically used by a known community, while a web application can be used by an unknown community. Looking at application types, Oracle Forms is excellent for high volume data entry and aligns with non-functionals like minimal mouse usage. In terms of development effort, Oracle Forms is still far more productive than any Java based approach (roughly 4 hours per function point versus 8 hours).
A couple of years ago the statement was that Oracle Forms was outdated, but slowly it is getting back into the picture at Oracle.
In The Netherlands we still have a large community of clients with (core) applications running on Oracle Forms (and not the latest version.....). For those clients, talk about SOA is shouting from beyond the horizon. And certainly in this financial climate, new investments are hard to get.

Currently the step towards SOA is too big (and expensive) for these organizations, but revitalization of the core systems built on Forms will be a must in the coming years. Forms can be used as a stepping stone towards a SOA based architecture: a Form can nowadays be exposed as a channel service in a web portal (see also ADF Forms) and can also call out to (Web) services. And Oracle is investing in Forms modernization again.

So is Oracle Forms going to get a second chance?

Thursday, January 15, 2009

Designing XML, reuse is nice but aim at usage first!

Designing data has been, for ages, the cornerstone of all applications. It covers all the information needed by, and shared between, applications. This is no different in the SOA world; actually, it becomes even more important. First of all, for what data is a service responsible? And secondly, what information is shared between the services?
The first part is about the definition of a data model, as (mostly) used in a database; this definition is used for storing the information and assuring its validity. The second part is about the definition of XML documents, as used to define service boundaries; this data is more fluid, floating around between services.

Now what makes a good data designer? In the database world it took us a while to learn how to define a thorough data model. It is not only about WHAT data is needed but also HOW the data is used: what types of queries are to be expected, what the data volumes per functionality are, how the data is related, and so on. A lot of 'old fashioned' techniques, like NIAM, helped us in the past to define a good data model.

When I look at the XML world, to my surprise, a lot of these good old design tricks are forgotten. In projects I come across XML definitions (coming from standards organizations) which are bound to fail in practice. Yes, all the data needed for the functionality is there (the WHAT question). The HOW question, however, is most of the time completely forgotten. In theory it looks beautiful to create an XML tree covering all the functionality needed in the world. In practice it means, most of the time, that the ratio of 'actually needed data' to 'total data sent' is less than 5%, which puts a burden on performance and network load.
It is also excellent that one XML document can contain all the information to be shared across all the services, but in practice it means that when you change one part of it, all services need to be changed.

With the ever increasing need to exchange data, it becomes essential that we aim for a methodology for designing XML documents. In the meantime, let's first apply the KISS principle (Keep It Simple, Stupid) and see whether an XML design really works in practice, on both a functional and a usage level. Only after that, start looking at reuse.....

Wednesday, January 7, 2009

Plea for standardization of consumer good chargers!


Once a month I do some travelling. Whether it be for business or for fun, I usually carry around the gear I need: laptop(s), phone, eBook reader, iPod, camera, etc. All these appliances need to be charged every so many days. No problem with that, but what annoys me is that EVERY appliance needs a DIFFERENT charger, each of a different type, size and weight. It gets even worse: there is no charger that I can reuse! And when I travel from country to country I also need adapters for power plugs, and sometimes even adapters for adapters. The whole IT world has been buzzing with SOA talk (Service Oriented Architecture), on the premise that everything should be based upon open standards. But when you look at the basic infrastructure for consumer goods, every brand delivers its own type of charger. What is the problem with defining a standard for charging consumer goods? Let's apply the SOA principles here as well! It would save me (and all the other travellers around the world) from carrying around a surplus of wires, adapters and chargers.

Thanks in advance!
