
Download the Complete DII COE Study Here

See also: CORBA Performance Overview

The Defense Information Systems Agency (DISA) facilitates the collaboration of multiple branches of the U.S. military in setting standards for information systems. One major initiative at DISA was the Defense Information Infrastructure Common Operating Environment (DII COE).

The DII COE Real-Time Integrated Product Team (DII COE RT IPT) focused on standards for embedded and non-embedded real-time systems. It was important for defense projects using or targeting real-time technologies to participate in the DII COE RT IPT to ensure that their requirements were fully represented. Boeing Phantom Works, working under the direction of the DII COE RT IPT, was tasked with studying Real-Time CORBA vendors and products. The resulting trade study included a thorough benchmark of several aspects of three real-time CORBA products.

The real-time ORB products in the study were:

  • HARDPack by Lockheed Martin Federal Systems, Owego
  • ORBexpress by Objective Interface
  • TAO by Washington University

The study was approved for public release and is available for download. The goals of the study were to:

  • Benchmark CORBA ORB middleware products, and identify the products meeting or exceeding the standards for real-time technologies;
  • Identify the viable RT CORBA products for recommendation into the DII COE infrastructure;
  • Determine Real-Time standards compliance and interoperability with other ORBs;
  • Highlight qualitative results (e.g., customer satisfaction, documentation) for each vendor.

The detailed data produced from the final benchmark was provided to each ORB vendor and was not disclosed publicly; Boeing has left further disclosure of this data to the ORB vendors. Objective Interface will make the detailed data for ORBexpress available upon request. For the other products, contact the respective vendors directly.

Customer Service Satisfaction
In addition to the ORB performance benchmarks, the study compared customer service satisfaction levels, gauging the quality and responsiveness of each vendor's technical support. These satisfaction levels were based on:

  • Ease of Use for Software Development
  • Ease of Use Compared with Other ORBs
  • Satisfaction with Vendor Response to Critical Problems
  • Satisfaction with Vendor Response to Routine Problems

The entire DII COE benchmark study is available here. The study includes the results shown below, as well as results for other platform configurations and other CORBA data types. The DII COE benchmark also includes tests for interoperability and predictability, and a review of the differences in architecture and platform support among the three ORBs.

Benchmark Environment
Boeing ran the benchmarks on both sparc-sun-solaris and powerpc-vme-lynxos.
The Solaris tests were run on an isolated 10 Mb Ethernet segment between two UltraSPARC 1/170 systems, each with a single 167 MHz UltraSPARC I CPU.
The LynxOS tests were run on an isolated 10 Mb Ethernet segment between a Motorola MV3600-2 PowerPC (LynxOS) and a Cetia PowerPC (LynxOS).

What the Tests Measured
The tests were run using seven types of IDL operations. Each operation had only one "in"-mode parameter, as follows:

Description      Parameter Type                                                        Operation Type
OW Primitives    A single struct containing a single array of a simple type            One-way
OW Records       An array of a struct containing three small arrays of simple types    One-way
OW Any           A single type Any containing the "Records" data                       One-way
CR Primitives    A single struct containing a single array of a simple type            Two-way
CR Records       An array of a struct containing three small arrays of simple types,   Two-way
                 with the struct layout aligned to a natural memory boundary
CR NA Records    An array of a struct containing three small arrays of simple types,   Two-way
                 with the struct layout not aligned to a natural memory boundary
CR Any           A single type Any containing the "Records" data                       Two-way

Each of the above represents the averaging of multiple tests. For example, "OW Primitives" was the average of sending an array of CORBA octets, shorts, longs, doubles, and other simple types in a single struct.
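
The difference between the "Records" and "NA Records" cases is whether the struct members fall on their natural memory boundaries. The study's exact field types were not published, so the layouts in the following C++ sketch are assumptions, but they illustrate why the distinction matters for marshaling cost:

    #include <cstdint>

    // "CR Records" style: each member starts on its natural boundary,
    // so the in-memory layout matches the marshaled layout and the ORB
    // can copy the struct in large blocks.
    struct AlignedRecord {
        double  d[2];   // 8-byte aligned, offsets 0-15
        int32_t l[2];   // 4-byte aligned, offsets 16-23
        int16_t s[2];   // 2-byte aligned, offsets 24-27
    };

    // "CR NA Records" style: the member order forces the compiler to
    // insert padding, so marshaling must repack the data field by field
    // instead of block-copying.
    struct NonAlignedRecord {
        int16_t s[2];   // offsets 0-3
        double  d[2];   // needs 8-byte alignment: 4 bytes of padding first
        int32_t l[2];   // offsets 24-31
    };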

Each test was run with a single client thread in the following scenarios:

Description   Test Span                                                O/S                Delay Between Invocations
Scenario 1    client/server on same machine                            Solaris            none
Scenario 1a   client/server on same machine                            Solaris            70 milliseconds
Scenario 2    client and server machines separated by 10 Mb Ethernet   Solaris            none
Scenario 3    client/server on same machine                            LynxOS             none
Scenario 3a   client/server on same machine                            LynxOS             70 milliseconds
Scenario 4    client and server machines separated by 10 Mb Ethernet   Solaris            70 milliseconds
Scenario 5    client and server machines separated by 10 Mb Ethernet   LynxOS             70 milliseconds
Scenario 6    client and server machines separated by 10 Mb Ethernet   Solaris & LynxOS   70 milliseconds
Scenario 7    client and server machines separated by 10 Mb Ethernet   LynxOS & Solaris   70 milliseconds

Only the results of Scenarios 1a, 3a, 4, and 5 were published in the report.
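
As a rough illustration of how a client-side loop for these scenarios is typically structured (the study's actual harness was not published; the stub type, operation name, and modern C++ timing facilities below are assumptions):

    #include <chrono>
    #include <thread>

    // Average the round-trip time of a two-way ("CR") invocation,
    // pausing between calls as in Scenarios 1a, 3a, 4, and 5.
    template <typename Stub, typename Payload>
    double average_invocation_ms(Stub& tester, const Payload& payload,
                                 int iterations, int delay_ms)
    {
        using clock = std::chrono::steady_clock;
        std::chrono::duration<double, std::milli> total{0};

        for (int i = 0; i < iterations; ++i) {
            auto start = clock::now();
            tester.cr_records(payload);      // hypothetical generated stub call
            total += clock::now() - start;   // clock stops when the reply returns

            // The 70 ms delay lets each invocation start from an idle
            // network/CPU state rather than a saturated pipeline.
            std::this_thread::sleep_for(std::chrono::milliseconds(delay_ms));
        }
        return total.count() / iterations;   // mean latency in milliseconds
    }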

Some of the Results

The results of the average invocation times in the Scenario 1a tests are presented in the table below. All times are in milliseconds.

Scenario 1a Test   Number of Argument Bytes   Sockets   HARDPack   ORBexpress   TAO
CR Primitives      144                        0.230     1.090      0.364        0.977
                   24,016                     0.642     2.893      0.958        1.603
CR NA Records      144                        0.230     1.092      0.383        1.176
                   24,016                     0.642     5.382      4.957        7.980
CR Any             144                        0.230     NP         0.9          3.4
                   24,016                     0.642     NP         24.3         78.9

* NP: not published
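
Reading the 144-byte CR Primitives row, the per-invocation overhead each ORB adds over raw sockets works out to 1.090 - 0.230 = 0.860 ms for HARDPack, 0.364 - 0.230 = 0.134 ms for ORBexpress, and 0.977 - 0.230 = 0.747 ms for TAO.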

The results of the average invocation times in the Scenario 3a tests are presented in the table below. HARDPack was not measured on LynxOS for the final report. All times are in milliseconds.

Scenario 3a Test   Number of Argument Bytes   Sockets   ORBexpress   TAO
CR Primitives      144                        0.148     0.268        1.148
                   24,016                     1.476     1.886        3.453
CR NA Records      144                        0.148     0.285        1.170
                   24,016                     1.476     4.878        21.620
CR Any             144                        0.148     0.595        2.771
                   24,016                     1.476     12.950       52.632

Note that the socket tests sent only simple octet arrays, not the data types listed in the tables above; handling the actual data types would have introduced additional processing for the socket tests. The socket times are repeated for each test for ease of comparison only.
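
For reference, a raw-socket round-trip measurement of the kind described above might look like the following sketch; the server address, port, and echo behavior are illustrative assumptions, not details from the study:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <chrono>
    #include <cstdio>
    #include <cstring>

    int main()
    {
        const int kBytes = 144;                // matches the small-payload rows
        char buf[kBytes];
        std::memset(buf, 0, sizeof(buf));      // simple octets: no marshaling at all

        int fd = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port   = htons(9000);         // assumed echo-server port
        inet_pton(AF_INET, "192.168.1.2", &addr.sin_addr);  // assumed server address
        if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) != 0)
            return 1;

        auto start = std::chrono::steady_clock::now();
        send(fd, buf, kBytes, 0);              // write the octet array
        ssize_t got = 0;
        while (got < kBytes) {                 // read until the full echo arrives
            ssize_t n = recv(fd, buf + got, kBytes - got, 0);
            if (n <= 0) return 1;
            got += n;
        }
        std::chrono::duration<double, std::milli> rt =
            std::chrono::steady_clock::now() - start;
        std::printf("round trip: %.3f ms\n", rt.count());
        close(fd);
        return 0;
    }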

Please refer to the RT IPT report for further results.

Objective Interface's Analysis of the Results

  • In our experience, the speed of all three products was well beyond the performance of most non-real-time ORB products.
  • In every test listed above and in the full RT IPT final trade study, ORBexpress performed better than both of the other real-time ORBs.
  • The ORBexpress results are consistent with our internal performance tests for the product.
  • While the Solaris version of ORBexpress GT used by Boeing (V2.1.3a) does not include several assembler optimizations currently present in other versions of ORBexpress (LynxOS, Windows NT, VxWorks, and others), the product performed within our expectations.