Title: December 2002
1. Interoperability and Integration IV: Performance Testing and Petshop 2.0, .NET vs. J2EE
MSDN Regional Director
2. The Middleware Company
- Provides advanced J2EE training and consulting
- Deep J2EE development experience and strong server-side skills
- Built and maintains TheServerSide.com
  - Leading online J2EE community
  - Focused on enterprise architects
3. The Middleware Benchmark
- Approached Microsoft to participate in a re-test of J2EE vs. .NET performance
- Spent 4 months testing J2EE and .NET performance
- New series of comprehensive benchmarks
- All results taken and certified by The Middleware Company
- Report available at http://www.middleware-company.com/j2eedotnetbench/
- Downloadable code, test scripts and discussion forum also available
4. Benchmarks Performed
- Web Application Benchmark
  - Exercises a 3-tier Web application
  - Data-driven page creation, middle-tier business logic, middle-tier data caching (see the sketch after this list), user session management, transactions
- Distributed Transaction Benchmark
  - Tests .NET/COM+ vs. J2EE/JTA distributed transaction management
  - Run for 24 hours to test reliability
  - Includes performance (TPS) and price/performance ($/TPS) metrics
- Web Services Benchmark
  - Tests XML Web Services (SOAP 1.1) performance
  - Tests the application as both SOAP server and SOAP client
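The full benchmark code is in the downloadable package at the URL above; the sketch below is not taken from it. It only illustrates the kind of middle-tier, read-through data cache the Web Application Benchmark exercises, with hypothetical class and method names.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of middle-tier data caching: a read-through cache in
// the application tier so repeat catalog lookups skip the database.
// Names are assumptions, not taken from the published benchmark code.
public class CatalogCache {
    private final Map cache = new HashMap();
    private final ProductLoader loader;

    // Abstracts the data tier (e.g. a DAO doing a JDBC SELECT).
    public interface ProductLoader {
        String loadDescription(String productId);
    }

    public CatalogCache(ProductLoader loader) {
        this.loader = loader;
    }

    // Return the cached description, loading it from the database on a miss.
    public synchronized String getDescription(String productId) {
        String descn = (String) cache.get(productId);
        if (descn == null) {
            descn = loader.loadDescription(productId); // one database hit per key
            cache.put(productId, descn);               // later requests served from memory
        }
        return descn;
    }
}
```

Caching catalog data in the application tier is what lets repeat page views avoid a database round trip, which is exactly the behavior this benchmark stresses.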
5. Products Tested
- Two leading, commercially available J2EE application servers
  - Identified as J2EE Application Server A and B due to license restrictions on benchmarks
  - Tested with the latest supported JVMs, highly tuned by experts for each application server
- .NET Framework 1.0 on Windows 2000 Advanced Server
- .NET Framework 1.1 on Windows .NET Server 2003
- All tests performed using Mercury LoadRunner 7.5
6. Report Highlights
- Significant time spent on tuning/configuring the J2EE application servers
  - 10 man-weeks spent per J2EE app server attempting to match .NET performance
  - Only 2 man-weeks spent tuning/optimizing the .NET benchmark application
- J2EE application servers tested on both MS Windows 2000 and Linux RH 7.2
  - App Server A exhibits significantly better performance on Windows 2000 vs. Linux
  - App Server B exhibits comparable performance on both
  - Windows 2000 chosen for the final test runs based on this determination
- J2EE Application Servers A and B both run the benchmark application, but exhibit completely different performance characteristics
- The Middleware Company abandons Container Managed Persistence (CMP) for performance reasons
  - Implements an EJB BMP-based solution (see the sketch after this list)
  - Must implement special read-only interfaces on entity beans to avoid performance constraints
- .NET 1.0 on Windows 2000 handily beats the best J2EE numbers
- .NET 1.1 on Windows .NET Server 2003 widens the gap with significantly better results
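The report does not reproduce the entity bean code here, and the read-only optimization in the tested servers is configured through vendor-specific deployment descriptors rather than a standard API. As a rough, hypothetical illustration of the BMP approach The Middleware Company switched to, the skeleton below loads its state with hand-written JDBC and leaves ejbStore() empty for catalog data that is never modified; table, column and JNDI names are assumptions.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.ejb.EntityBean;
import javax.ejb.EntityContext;
import javax.ejb.FinderException;
import javax.ejb.ObjectNotFoundException;
import javax.naming.InitialContext;
import javax.sql.DataSource;

// Hypothetical BMP skeleton: persistence is hand-coded JDBC instead of CMP,
// and ejbStore() is a no-op because this catalog data is only ever read.
public class ItemBean implements EntityBean {
    private EntityContext ctx;
    private String itemId;      // primary key
    private String description; // catalog data loaded in ejbLoad()

    public String ejbFindByPrimaryKey(String pk) throws FinderException {
        try {
            Connection c = dataSource().getConnection();
            try {
                PreparedStatement ps =
                    c.prepareStatement("SELECT itemid FROM item WHERE itemid = ?");
                ps.setString(1, pk);
                ResultSet rs = ps.executeQuery();
                if (!rs.next()) throw new ObjectNotFoundException(pk);
                return pk; // container will call ejbLoad() for this key
            } finally {
                c.close();
            }
        } catch (ObjectNotFoundException notFound) {
            throw notFound;
        } catch (Exception e) {
            throw new FinderException(e.getMessage());
        }
    }

    public void ejbLoad() {
        itemId = (String) ctx.getPrimaryKey();
        try {
            Connection c = dataSource().getConnection();
            try {
                PreparedStatement ps =
                    c.prepareStatement("SELECT descn FROM item WHERE itemid = ?");
                ps.setString(1, itemId);
                ResultSet rs = ps.executeQuery();
                if (rs.next()) description = rs.getString(1);
            } finally {
                c.close();
            }
        } catch (Exception e) {
            throw new javax.ejb.EJBException(e);
        }
    }

    // Read-only usage: nothing to write back, so browsing never pays for an UPDATE.
    public void ejbStore() { }

    public String getDescription() { return description; }

    private DataSource dataSource() throws Exception {
        return (DataSource) new InitialContext().lookup("java:comp/env/jdbc/ProductsDB");
    }

    // Remaining EntityBean lifecycle callbacks.
    public void setEntityContext(EntityContext c) { ctx = c; }
    public void unsetEntityContext() { ctx = null; }
    public void ejbActivate() { }
    public void ejbPassivate() { }
    public void ejbRemove() { }
}
```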
7. Test Lab
8. Web Application Benchmark
- Two scripts run in a 50/50 mix
  - Browse-only script simulates users visiting the site, performing searches and viewing product details
  - Order script includes ad-hoc searching and ordering of random items
- Order process involves a 5-page checkout procedure
  - Last step is a distributed transaction
  - Two-phase commit to the Orders database (order and detail insertions) and the Products database (inventory update); see the sketch after this list
- All data points include ramp-up, settle-out, ramp-down and data-collection periods
- Tested with image download on and image download off
- Full error checking/reporting for all test runs
- Large user loads placed on the systems to determine ability to scale across 2-, 4- and 8-CPU systems
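On the J2EE side, that final checkout step corresponds to one JTA transaction spanning two XA data sources, so the order inserts and the inventory update commit or roll back together. The sketch below is an assumption of how such a step can be coded with bean-managed transactions (the benchmark's actual code is in the downloadable package); JNDI paths, table and column names are placeholders.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.naming.InitialContext;
import javax.sql.DataSource;
import javax.transaction.UserTransaction;

// Hypothetical sketch of the checkout's last step: one JTA transaction
// covering inserts in the Orders database and an update in the Products
// database, committed with two-phase commit.
public class OrderPlacer {

    public void placeOrder(String orderId, String itemId, int qty) throws Exception {
        InitialContext jndi = new InitialContext();
        UserTransaction utx =
            (UserTransaction) jndi.lookup("java:comp/UserTransaction");
        DataSource ordersDb   = (DataSource) jndi.lookup("java:comp/env/jdbc/OrdersDB");
        DataSource productsDb = (DataSource) jndi.lookup("java:comp/env/jdbc/ProductsDB");

        utx.begin();
        try {
            Connection orders = ordersDb.getConnection();
            try {
                Connection products = productsDb.getConnection();
                try {
                    // Order header insert (detail rows would follow the same pattern).
                    PreparedStatement po = orders.prepareStatement(
                        "INSERT INTO orders (orderid) VALUES (?)");
                    po.setString(1, orderId);
                    po.executeUpdate();

                    // Inventory update in the second database, same transaction.
                    PreparedStatement pi = products.prepareStatement(
                        "UPDATE inventory SET qty = qty - ? WHERE itemid = ?");
                    pi.setInt(1, qty);
                    pi.setString(2, itemId);
                    pi.executeUpdate();
                } finally {
                    products.close();
                }
            } finally {
                orders.close();
            }
            utx.commit();   // transaction manager drives two-phase commit across both databases
        } catch (Exception e) {
            utx.rollback(); // any failure undoes the work in both databases
            throw e;
        }
    }
}
```

In the benchmark the transaction is managed by the middle tier; with container-managed transactions the explicit begin/commit/rollback calls would be replaced by the session bean's transaction attributes.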
9. Web Application Benchmark
10. Web Application Benchmark
11. Web Application Benchmark
12. Web Application Benchmark
13. Distributed Transaction Benchmark
- Transaction-heavy test script exercises order placement only
- Each order placed results in a distributed transaction (tx) with 2-phase commit
- Tx is managed by the middle tier (app server)
  - Measures .NET tx performance through COM+ Enterprise Services
  - Measures J2EE tx performance through JTA
- Run at peak throughput for 24 hours
  - Shows whether throughput is sustainable
  - Measures TPS, total orders, errors
  - Shows relative reliability over the 24-hour period
  - Compare to ECperf, which is only run for 30 minutes
- Includes a price/performance ratio in dollar cost per TPS (see the sketch after this list)
  - Similar to the price/perf metric in ECperf and TPC-C
  - Includes the cost of application server software, OS and hardware
  - Does not include database software/hardware costs, or maintenance/support costs
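To make the two headline metrics concrete, the arithmetic is simply sustained throughput over the measurement period and total system cost divided by that throughput. All figures below are placeholders, not numbers from the report.

```java
// Sketch of the metric calculation; every figure is a hypothetical placeholder.
public class PricePerformance {
    public static void main(String[] args) {
        long totalOrders = 8640000L;        // hypothetical: orders completed in 24 hours
        long runSeconds = 24L * 60 * 60;    // 24-hour measurement period

        double tps = (double) totalOrders / runSeconds;

        // Cost basis per the report: app server license + OS + hardware;
        // database software/hardware and support costs are excluded.
        double appServerCost = 40000.0;     // hypothetical license cost
        double osCost = 4000.0;
        double hardwareCost = 25000.0;
        double systemCost = appServerCost + osCost + hardwareCost;

        double dollarsPerTps = systemCost / tps;

        System.out.println("Sustained TPS      = " + tps);
        System.out.println("Price/perf ($/TPS) = " + dollarsPerTps);
    }
}
```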
14. Distributed Transaction 24-Hour Benchmark
15. Distributed Transaction 24-Hour Benchmark
16. Web Services Benchmark
- Test A (see the client sketch after this list)
  - Load-generating clients make direct SOAP/HTTP requests
  - A single application server hosts the Web Service
  - Well-formed XML Order object returned
- Test B
  - Load-generating clients make HTML/HTTP requests to a JSP/ASPX SOAP client application
  - The client application makes a distributed SOAP request to the Web Service host machine
  - Both the SOAP client application and the Web Service host computer run the same application server software (interop cases not tested)
  - Well-formed XML Order object returned
  - Order object formatted into HTML for display via JSP/ASPX
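In the actual runs the SOAP traffic is generated by Mercury LoadRunner; the sketch below only shows the shape of a Test A request, a hand-rolled SOAP 1.1 envelope posted directly to the Web Service host. The endpoint URL, SOAPAction value and element names are made-up placeholders.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical Test A client: posts a SOAP 1.1 envelope straight to the
// Web Service host and prints the returned Order XML.
public class DirectSoapClient {
    public static void main(String[] args) throws Exception {
        String envelope =
            "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body>" +
            "<GetOrder xmlns=\"http://example.org/bench\">" +
            "<orderId>1001</orderId>" +
            "</GetOrder>" +
            "</soap:Body>" +
            "</soap:Envelope>";

        URL endpoint = new URL("http://appserver/bench/OrderService"); // placeholder endpoint
        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "\"http://example.org/bench/GetOrder\"");

        OutputStream out = conn.getOutputStream();
        out.write(envelope.getBytes("UTF-8"));
        out.close();

        // Read the SOAP response containing the serialized Order object.
        BufferedReader in = new BufferedReader(
            new InputStreamReader(conn.getInputStream(), "UTF-8"));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
        in.close();
    }
}
```

Test B differs only in that this request originates from a JSP/ASPX client application in the middle tier, which then renders the returned Order as HTML for the browser.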
17. Web Services Benchmark, Test A: Direct Activation
18. Web Services Benchmark, Test A: Direct Activation
19. Web Services Benchmark, Test B: Activation via Proxy
20. Web Services Benchmark, Test B: Activation via Proxy
21. Conclusions
- Independent Middleware Company benchmark shows .NET significantly outperforms J2EE as tested on two leading J2EE app servers
  - Web application hosting
  - Transaction performance
  - XML Web Service performance
- .NET offers significant cost savings over market-leading J2EE application servers
  - Based on the calculated price/performance metric
- .NET application proves more reliable than its J2EE counterpart, and significantly easier to tune for load
- .NET 1.1 on Windows .NET Server 2003 offers significant performance gains over .NET 1.0 on Windows 2000
- Middleware report, source code and test scripts available online
  - http://www.middleware-company.com/j2eedotnetbench/
22. Corporate Developers Forum: Interoperability and Integration IV, Performance Testing and Petshop 2.0, .NET and J2EE