SBM Orchestration Guide → Orchestration Best Practices → Scalability
This section provides best practices for designing orchestration
workflows to maximize performance and scalability.
- An application that performs large-scale data transformations will
not scale well if it is implemented as an orchestration. Instead of using a
single orchestration to process a large number of items, break the work into
smaller chunks or consider another method of processing the data. In
general, use orchestrations only to enable collaboration between systems.
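The chunking approach above can be sketched outside of an orchestration, for example by splitting a large work list into fixed-size batches that are processed independently. This is a minimal Python sketch; the chunk size and the `process_chunk` handler are illustrative assumptions, not part of SBM:

```python
def chunked(items, size):
    """Yield successive fixed-size chunks from a list of work items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def process_chunk(chunk):
    # Placeholder for real per-chunk processing (e.g., a bulk API call).
    return len(chunk)

# Split 10 items into chunks of 4 and process each chunk separately.
work = list(range(10))
results = [process_chunk(c) for c in chunked(work, 4)]
```

Each chunk can then be dispatched to a separate worker or bulk operation, keeping any single orchestration run small.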
- Use orchestration workflows to solve the following business problems:
- Collaboration with existing legacy systems: If a legacy
system can receive Web service calls, it can be called from asynchronous
orchestration workflows that are executed during transitions in an application.
Use asynchronous (not synchronous) orchestration workflows
for this purpose; see the
synchronous orchestration workflow limitations.
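A step in such a workflow ultimately sends a Web service request to the legacy system. As a hedged sketch, the following Python code builds a minimal SOAP 1.1 envelope and posts it; the endpoint URL, the `UpdateTicket` operation, and its parameters are hypothetical, not part of any real legacy system's contract:

```python
import urllib.request

def build_soap_envelope(operation, params):
    """Build a minimal SOAP 1.1 envelope for a legacy Web service call.
    The operation name and parameters are illustrative only."""
    body = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        f"<soap:Body><{operation}>{body}</{operation}></soap:Body>"
        "</soap:Envelope>"
    )

def call_legacy_service(endpoint, operation, params):
    """POST the envelope to the legacy endpoint (hypothetical URL)."""
    envelope = build_soap_envelope(operation, params).encode("utf-8")
    req = urllib.request.Request(
        endpoint, data=envelope,
        headers={"Content-Type": "text/xml; charset=utf-8"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

envelope = build_soap_envelope("UpdateTicket", {"ticketId": "42", "status": "Closed"})
```

Because the call runs in an asynchronous workflow, a slow legacy endpoint delays only the orchestration step, not the user's transition.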
- Collaboration with SOA-capable systems: Application
workflows (also known as "human workflows") can easily be integrated with other
SOA (service-oriented architecture) systems. These integrations use
orchestration workflows initiated from the human workflow to call Web services,
passing data to and receiving responses from the external systems. In
addition, because the Event Manager can be called as a Web service,
asynchronous orchestrations can be initiated from events occurring in the
external systems to communicate back to the application workflow.
- Intelligent data enrichment: Users make
decisions and take actions based on information available within the system.
Often they can make better decisions if they have access to related information
stored outside of the system. A synchronous orchestration workflow can
implement business logic to collect and coordinate data from several sources to
present in a form. For example, a form in a credit approval application can
include a customer's credit information.
Synchronous orchestration workflows that use programming
constructs such as comparisons, string and number manipulation, and loops can
also be used to validate data that users enter.
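A synchronous enrichment step like the one described above can be sketched as a function that merges an item's internal data with related external data for display on a form, plus a simple comparison-based validation. This is a Python sketch; the field names and the external credit lookup are illustrative assumptions:

```python
def lookup_credit_info(customer_id):
    # Stand-in for a Web service call to an external credit bureau.
    external = {"C100": {"credit_score": 712, "credit_limit": 5000}}
    return external.get(customer_id, {})

def enrich_form_data(item):
    """Combine item data with related external data for a form."""
    enriched = dict(item)
    enriched.update(lookup_credit_info(item["customer_id"]))
    return enriched

def validate_amount(enriched):
    """Simple comparison-based validation: the requested amount
    must not exceed the customer's credit limit."""
    return enriched["requested_amount"] <= enriched.get("credit_limit", 0)

form = enrich_form_data({"customer_id": "C100", "requested_amount": 3000})
ok = validate_amount(form)
```

In a real synchronous orchestration the lookup would be one or more Web service calls coordinated by the workflow, with the merged result returned to the form.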
- Process-level synchronization: In this scenario, the data
between two or more systems is continuously synchronized as items flow through
various business processes. A related scenario, batch process
synchronization, is not suitable for orchestration workflows and is described
below.
- Do not use orchestration workflows to perform the following tasks:
- Visual programming:
Orchestration workflows can be created visually using
constructs such as loops, decisions, data manipulation, and fault handling.
While it is possible to use these constructs as a general-purpose programming
language to create complex procedural programs, doing so is discouraged. It is
better to use these constructs to apply business logic while coordinating or
orchestrating the Web services called in the workflow.
Drawbacks of embedding long procedural modules in orchestration workflows
include the following: the workflows become so long that they are difficult to
view, debug, and maintain; designers lose the compile-time checks
that languages such as Java offer; and these constructs perform
far more slowly than traditional programming languages. Move complex
logic or data manipulation out of orchestration workflows and
into Web services that can be invoked as part of the orchestration workflow.
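As an illustration of factoring complex logic out of a workflow, the sketch below keeps an intricate string-manipulation routine in ordinary code that would sit behind a Web service endpoint, so the orchestration needs only a single service invocation per item. The part-number format and `normalize_part_number` function are hypothetical:

```python
import re

def normalize_part_number(raw):
    """Complex string manipulation kept out of the orchestration:
    uppercase, strip separators, and zero-pad the numeric suffix."""
    cleaned = re.sub(r"[\s\-_]", "", raw.upper())
    match = re.fullmatch(r"([A-Z]+)(\d+)", cleaned)
    if not match:
        raise ValueError(f"unrecognized part number: {raw!r}")
    prefix, number = match.groups()
    return f"{prefix}-{int(number):06d}"

# In practice this function would be deployed behind a Web service,
# so the orchestration makes one call instead of looping over
# comparisons and string operations in visual constructs.
result = normalize_part_number(" ab-123 ")
```

The workflow stays short and readable, and the logic gains the testing and tooling benefits of a conventional language.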
- ETL processing: Data warehousing Extract, Transform, and
Load (ETL) processing is data intensive and requires minimal per-record
overhead and very fast data transformation. Implementing ETL with an
orchestration workflow entails at least two Web service calls for each record
that is processed, plus per-record data transformation overhead.
Using an orchestration workflow for ETL processing therefore results in
poor performance and unacceptably complex visual programming.
- Batch process synchronization: In this scenario,
synchronization takes place after the fact in a batch process. It involves
processing a large number of records in multiple systems to ensure that they
remain synchronized after changes to one or more systems. An example is
synchronizing a customer or product master database with data stored in
distributed systems. To achieve this with an orchestration workflow, you would
need to iterate through each record on each system, making multiple Web service
calls per iteration to compare the data between systems and, if necessary,
making additional Web service calls to update the data.
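The same comparison is far cheaper in a plain batch program that diffs whole record sets in memory and emits only the needed changes. As a minimal sketch (the record data is illustrative), this computes what a replica must add, update, and delete to match a master:

```python
def diff_records(master, replica):
    """Compare two keyed record sets in bulk and report the changes
    the replica needs in order to match the master."""
    to_add = {k: v for k, v in master.items() if k not in replica}
    to_update = {k: v for k, v in master.items()
                 if k in replica and replica[k] != v}
    to_delete = [k for k in replica if k not in master]
    return to_add, to_update, to_delete

master = {"p1": "widget", "p2": "gadget", "p3": "gizmo"}
replica = {"p1": "widget", "p2": "gadget-old", "p4": "stale"}
add, update, delete = diff_records(master, replica)
```

One bulk extract per system plus one in-memory diff replaces the per-record Web service calls an orchestration would require.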
Copyright © 2007–2018 Serena Software, Inc., a Micro Focus company. All rights reserved.