Transcript and Presenter's Notes

Title: WS-DREAM: A Distributed Reliability Assessment Mechanism for Web Services


1
WS-DREAM: A Distributed Reliability Assessment
Mechanism for Web Services
Zibin Zheng, Michael R. Lyu
Department of Computer Science and Engineering
The Chinese University of Hong Kong, Hong Kong, China
DSN 2008, Anchorage, Alaska, USA, 25 June 2008
2
Outline
  1. Introduction
  2. Design
  3. Implementation
  4. Experiments
  5. Conclusion

3
1. Introduction
  • Service-Oriented Architecture (SOA) is becoming
    popular.
  • SOA applications are usually built on Web services.
  • The reliability of service-oriented applications is
    therefore difficult to guarantee:
  • Remote Web services may contain faults.
  • Remote Web services may become unavailable.
  • The Internet environment is unpredictable.

We need to know whether target Web services are
reliable before using them.
4
1. Introduction
  • The performance of a Web service differs across
    user locations.
  • Service-oriented applications may be deployed to
    different locations after development.
  • Distributed reliability assessment of Web
    services is therefore necessary, but it is:
  • Difficult.
  • Time-consuming.
  • Expensive.

5
1. Introduction
  • WS-DREAM: a Distributed REliability Assessment
    Mechanism for Web services.
  • Built on user collaboration:
  • YouTube users share videos.
  • Wikipedia users share knowledge.
  • WS-DREAM users share assessment results of target
    Web services.
  • Obtains performance information on individual Web
    services from different locations, for Web service
    selection and ranking.
  • Assesses fault-tolerance replication strategies.

6
2. Design
  • 1. Assessment request.
  • 2. Load Applet.
  • 3. Create test cases.
  • 4. Test task scheduling.
  • 5. Clients get test plans.
  • 6. Clients run test plans.
  • 7. Send back results.
  • 8. Analyze results and return the final results
    to the client (a client-side sketch follows below).
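
A minimal Java sketch of the client side of steps 5 to 7; the coordinator URL, the query parameters, and the plain-text exchange are assumptions for illustration, not the actual WS-DREAM interface.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal sketch of the client side of steps 5-7. The coordinator URL,
// query parameters, and plain-text exchange are illustrative assumptions,
// not the actual WS-DREAM interface.
public class DreamClient {
    static final String COORDINATOR = "http://example.org/ws-dream/coordinator";

    public static void main(String[] args) throws Exception {
        // Step 5: fetch the next test plan assigned to this client.
        String plan = httpGet(COORDINATOR + "?action=getTestPlan");

        // Step 6: run the test plan and time it.
        long start = System.currentTimeMillis();
        String outcome = runTestPlan(plan);
        long rtt = System.currentTimeMillis() - start;

        // Step 7: send the result back for server-side analysis (step 8).
        httpGet(COORDINATOR + "?action=report&outcome=" + outcome + "&rtt=" + rtt);
    }

    static String runTestPlan(String plan) {
        // A real client would invoke the target Web services named in the
        // plan; this placeholder always reports success.
        return "success";
    }

    static String httpGet(String url) throws Exception {
        HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream()))) {
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) sb.append(line);
            return sb.toString();
        } finally {
            con.disconnect();
        }
    }
}
```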

7
2. Design
  • Fairness. Different Web services should have a
    fair chance of being assessed.
  • Distribution. Web services should be assessed by
    users in as many geographical locations as possible.
  • Feasibility. Task assignment should adjust
    dynamically to the frequently changing numbers of
    users and test plans.
  • Efficiency. The algorithm should be efficient and
    should not slow down the testing progress (a toy
    sketch follows below).
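
As a toy illustration of how these goals can interact, here is a least-assigned scheduler sketch; the data structures and method names are assumptions, not the paper's actual scheduling algorithm.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy least-assigned scheduler illustrating the fairness and distribution
// goals; data structures and names are assumptions, not the paper's algorithm.
public class TestScheduler {
    // runs.get(plan).get(location) = times this plan ran from that location.
    private final Map<String, Map<String, Integer>> runs = new HashMap<>();

    // Give the caller the pending plan its location has executed least often,
    // so every plan gets a fair chance (fairness) and coverage spreads over
    // locations (distribution). One linear pass keeps it cheap (efficiency),
    // and nothing is precomputed, so users and plans can come and go
    // (feasibility).
    public synchronized String nextPlan(String location, List<String> pending) {
        String best = null;
        int bestCount = Integer.MAX_VALUE;
        for (String plan : pending) {
            int count = runs
                    .computeIfAbsent(plan, p -> new HashMap<>())
                    .getOrDefault(location, 0);
            if (count < bestCount) {
                bestCount = count;
                best = plan;
            }
        }
        if (best != null) {
            runs.get(best).merge(location, 1, Integer::sum);
        }
        return best; // null when no plans are pending
    }
}
```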

8
2. Design
  • Identical and similar Web services are becoming
    available on the Internet → redundant replicas
    for fault tolerance become cheaper.
  • Basic replication strategies (sketched in code
    below the table):
  • Parallel. The application sends requests to
    different replicas at the same time.
  • Retry. The same Web service is tried one more
    time if it fails at first.
  • Recovery Block (RB). Another standby Web service
    is tried in sequence if the primary Web service
    fails.

              Parallel            Retry              RB
  Parallel    1. Parallel         4. Parallel+Retry  6. Parallel+RB
  Retry       5. Retry+Parallel   2. Retry           8. Retry+RB
  RB          7. RB+Parallel      9. RB+Retry        3. RB
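
The three basic strategies map naturally onto code. Below is a minimal Java sketch, assuming each replica is wrapped as a Supplier that throws an unchecked exception on failure; the class and method names are illustrative, not the WS-DREAM API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Supplier;

// Minimal sketch of the three basic strategies. Each replica is assumed to
// be wrapped as a Supplier that throws an unchecked exception on failure;
// the class and method names are illustrative, not the WS-DREAM API.
public class ReplicationStrategies {

    // 1. Parallel: invoke all replicas at once; the first success wins.
    static <T> T parallel(List<Supplier<T>> replicas) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(replicas.size());
        try {
            List<Callable<T>> tasks = new ArrayList<>();
            for (Supplier<T> r : replicas) tasks.add(r::get);
            return pool.invokeAny(tasks); // returns the first successful result
        } finally {
            pool.shutdownNow();
        }
    }

    // 2. Retry: try the same replica one more time if the first call fails.
    static <T> T retry(Supplier<T> replica) {
        try {
            return replica.get();
        } catch (RuntimeException first) {
            return replica.get(); // second and final attempt
        }
    }

    // 3. Recovery Block (RB): try standby replicas in sequence until one
    // succeeds; rethrow the last failure if none does.
    static <T> T recoveryBlock(List<Supplier<T>> replicas) {
        RuntimeException last = null;
        for (Supplier<T> r : replicas) {
            try {
                return r.get();
            } catch (RuntimeException e) {
                last = e;
            }
        }
        if (last == null) throw new IllegalStateException("no replicas given");
        throw last;
    }
}
```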
9
2. Design
  • Hybrid strategies combine two basic strategies:
  • 4. Parallel+Retry    5. Retry+Parallel
  • 6. Parallel+RB       7. RB+Parallel
  • 8. Retry+RB          9. RB+Retry
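
The slide does not spell out how the pairs compose, so the following is one plausible reading, reusing the ReplicationStrategies sketch above: the first-named strategy forms the outer layer, e.g., Retry+RB retries the whole recovery block once if it fails.

```java
import java.util.List;
import java.util.function.Supplier;

// Hypothetical composition of strategy 8 (Retry+RB), reusing the
// ReplicationStrategies sketch above: retry the whole recovery block once.
class HybridStrategies {
    static <T> T retryRB(List<Supplier<T>> replicas) {
        return ReplicationStrategies.retry(
                () -> ReplicationStrategies.recoveryBlock(replicas));
    }
}
```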

10
2. Design
  • XML-based test plan design.
  • Assesses the performance of different replication
    strategies.
  • Includes several test cases.
  • Created by the WS-DREAM server and executed on
    the client side (an illustrative sketch follows
    below).

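For illustration, a test plan of this kind might look like the following; the element names and attributes are invented here, not the actual WS-DREAM schema.

```xml
<!-- Hypothetical test-plan layout; element names and attributes are
     invented for illustration, not the actual WS-DREAM schema. -->
<testPlan id="tp-001" strategy="Parallel+RB">
  <testCase id="tc-1">
    <service>a-us</service>
    <operation>ItemLookup</operation>
    <timeoutMs>10000</timeoutMs>
  </testCase>
  <testCase id="tc-2">
    <service>a-uk</service>
    <operation>ItemLookup</operation>
    <timeoutMs>10000</timeoutMs>
  </testCase>
</testPlan>
```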
11
3. Implementation
  • JDK + Eclipse.
  • Client side:
  • Java Applet.
  • Server side:
  • an HTTP Web site (Apache HTTP Server)
  • a TestCaseGenerator (JDK 6.0 + Axis library)
  • a TestCoordinator (Java Servlet on Tomcat 6.0;
    a skeleton follows below)
  • a MySQL database (records testing results)
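
A skeleton of such a coordinator servlet, in the spirit of the stack above; the class name, query parameters, and responses are assumptions, not the actual TestCoordinator code.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Skeleton in the spirit of the stack above (a servlet on Tomcat). The
// class name, query parameters, and responses are assumptions, not the
// actual TestCoordinator code.
public class TestCoordinatorServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String action = req.getParameter("action");
        resp.setContentType("text/xml");
        if ("getTestPlan".equals(action)) {
            // Hand the next scheduled test plan to the calling applet.
            resp.getWriter().write("<testPlan id=\"tp-001\"/>");
        } else if ("report".equals(action)) {
            // A real coordinator would store the result in the MySQL database.
            resp.getWriter().write("<ack/>");
        } else {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST);
        }
    }
}
```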

12
4. Experiments
  • A service user plans to employ several Web
    services in his commercial Web site:
  • Six identical Amazon book displaying and selling
    Web services, for fault-tolerance purposes (a-us,
    a-jp, a-de, a-ca, a-fr, and a-uk).
  • A Global Weather Web service to display current
    weather information.
  • A GeoIP Web service to obtain geographical
    information about Web site visitors.

13
4. Experiments
  • 1. Assess the reliability of individual Web
    services.
  • Among all 5443 failure cases:
  • 2986 failures are due to timeout (longer than
    10 seconds);
  • 2456 failures are due to unavailable service
    (HTTP code 503);
  • 1 failure is due to bad gateway (HTTP code 502).

14
4. Experiments
15
4. Experiments
2. Measure the performance of different
replication strategies.
  • Strategy 1 (Parallel) provides the best RTT
    performance.
  • The sequential strategies (2. Retry, 3. RB,
    8. Retry+RB, and 9. RB+Retry) provide good RTT
    performance in the normal environment, but their
    performance degrades in the faulty environment.

16
4. Experiments
3. Determine the optimal number of replicas.
  • Two replicas are enough to provide high
    availability in the normal Internet environment,
    while three replicas are needed to ensure high
    availability in the 5% faulty Internet
    environment.
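
As a back-of-envelope check (under an independence assumption the slide does not state): if each replica fails with probability p, a redundant group of k replicas fails with probability p^k. Taking p = 0.05 to match the 5% fault-injection setting, two replicas still fail 0.25% of the time, while three fail only about 0.0125% of the time, consistent with needing one extra replica in the faulty environment.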

17
5. Conclusion and future work
  • Conclusion: WS-DREAM provides
  • reliability assessment of individual Web
    services;
  • performance assessment of fault-tolerance
    replication strategies.
  • Experiments:
  • more than 1,000,000 test plans executed;
  • users from five locations;
  • Web services located in six countries.
  • Future work:
  • assessment of stateful Web services;
  • enhancement of system features to facilitate
    user contributions of test cases.

18
WS-DREAM: A Distributed Reliability Assessment
Mechanism for Web Services
Zibin Zheng, Michael R. Lyu
Department of Computer Science and Engineering
The Chinese University of Hong Kong, Hong Kong, China
DSN 2008, Anchorage, Alaska, USA, 25 June 2008