Proposal for New Work Item
1
  • Proposal for New Work Item
  • SIP PERFORMANCE BENCHMARKING
  • OF NETWORKING DEVICES
  • draft-poretsky-sip-bench-term-02.txt
  • draft-poretsky-sip-bench-meth-00.txt
  • Co-authors:
  • Vijay Gurbani of Lucent Technologies
  • Carol Davids of IIT VoIP Lab
  • Scott Poretsky of Reef Point Systems

67th IETF Meeting, San Diego
2
Motivation
  • Service Providers are now planning VoIP and
    Multimedia network deployments using the
    IETF-developed Session Initiation Protocol (SIP).
  • The mix of SIP signaling and media functions has
    produced inconsistencies in vendor reported
    performance metrics and has caused confusion in
    the operational community. (Terminology)
  • SIP allows a wide range of configuration and
    operational conditions that can influence
    performance benchmark measurements.
    (Methodology)

3
More Motivation
  • Service Providers can use the benchmarks to
    compare performance of RFC 3261 devices. Server
    performance can be compared to other servers,
    SBCs, and Servers paired with SIP-Aware
    Firewall/NATs.
  • Vendors and others can use benchmarks to ensure
    performance claims are based on common
    terminology and methodology.
  • Benchmark metrics can be applied to make device
    deployment decisions for IETF SIP and 3GPP IMS
    networks.

4
Scope
  • Terminology defines performance benchmark metrics
    for black-box measurements of SIP networking
    devices.
  • Methodology provides standardized use cases that
    describe how to collect those metrics.

5
Devices vs. Systems Under Test
  • DUT
  • MUST be an RFC 3261-compliant device.
  • MAY include a SIP-Aware Firewall/NAT and other
    functionality.
  • BMWG does not standardize compliance testing.
  • SUT
  • An RFC 3261-compliant device with a separate
    external SIP Firewall/NAT.
  • Anticipates the need to test the performance of a
    SIP-aware functional element that is not itself
    defined in RFC 3261.

6
Overview of Terminology Draft
  • The terminology document distinguishes between
    INVITE transactions and non-INVITE transactions.
  • Thus the document addresses the fact that the
    SIP-enabled network provides services and
    applications other than PSTN replacement
    services.
  • The equipment as well as the network needs to
    support the total load.

7
Benchmarks defined
  • The following benchmarks have been defined:
  • Registration Rate
  • Session Rate
  • Session Capacity
  • Associated media sessions - establishment rate
  • Associated media sessions - setup delay
  • Associated media sessions - disconnect delay
  • Standing associated media sessions
  • IM rate
  • Presence rate
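As an illustration only (this sketch is not part of either draft, and all names are hypothetical), a Session Rate figure could be derived from per-session establishment timestamps recorded by the tester:

```python
# Hypothetical sketch: deriving a session-rate benchmark from
# establishment timestamps (in seconds) logged by the tester.
def session_rate(establish_times):
    """Return established sessions per second over the test interval."""
    if len(establish_times) < 2:
        return 0.0
    duration = max(establish_times) - min(establish_times)
    if duration == 0:
        return float(len(establish_times))
    return len(establish_times) / duration

# 6 sessions established at one-second intervals over 5 seconds
print(session_rate([0, 1, 2, 3, 4, 5]))  # 1.2 sessions/second
```

The same shape would apply to Registration Rate, IM rate, and Presence rate, with the timestamps taken from the corresponding transactions.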

8
Complements SIPPING Work Item
  • SIPPING - Malas draft 05 relates to end-to-end
    network metrics.
  • BMWG - Poretsky et al., terminology draft 02 and
    methodology draft 00, relate to network-device
    metrics.
  • A network device performs work whether a
    registration succeeds or fails, whether or not an
    attempted session is created, whether or not a
    disconnect is successful, and whether or not a
    media session is created at all.
  • A SIP-enabled network also carries signaling
    traffic whether or not a media session is
    successfully created. For example, IM, Presence
    and more generally subscription services all
    require network resources as well as computing
    device resources
  • For this reason, we think that many of the BMWG
    metrics complement the Malas draft and can also
    inform that document.

9
Methodology
  • Two forms of test topology:
  • Basic SIP Performance Benchmarking Topology with
    Single DUT and Tester
  • Optional SUT Topology with Firewall/NAT between
    DUT and Tester when Media is present
  • Test Considerations:
  • Selection of SIP Transport Protocol
  • Associated Media
  • Session Duration
  • Attempted Sessions per Second
  • Need to complete test cases; looking for more
    test cases.
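As a hedged sketch (field names are illustrative, not taken from the methodology draft), the test considerations above could be grouped into a single benchmark configuration:

```python
from dataclasses import dataclass

# Illustrative grouping of the test considerations listed above;
# names and defaults are hypothetical, not normative.
@dataclass
class SipBenchmarkConfig:
    transport: str = "UDP"               # SIP transport protocol selection
    associated_media: bool = False       # whether media accompanies signaling
    session_duration_s: float = 60.0     # hold time per established session
    attempted_sessions_per_s: int = 100  # offered signaling load

cfg = SipBenchmarkConfig(transport="TCP", associated_media=True)
print(cfg)
```

Each test case in the methodology would then fix one such configuration so that results from different vendors are comparable.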

10
Next Steps for Terminology and Methodology
  • Complete methodology
  • Incorporate comments from mailing list
  • Propose that BMWG make this a work item

11
BACKUP - Relevance to BMWG
  • -----Original Message-----
  • From: Romascanu, Dan (Dan) <dromasca@avaya.com>
  • Sent: Sunday, June 25, 2006 6:00 AM
  • I believe that the scope of the 'SIP Performance
    Metrics' draft is within the scope of what bmwg
    is doing for a while, quite successfully, some
    say. On a more 'philosophical plan', there is
    nothing that says that the IETF work must
    strictly deal with defining the bits in the
    Internet Protocols - see
    http://www.ietf.org/internet-drafts/draft-hoffman-taobis-08.txt.
    And in any case, measuring how a protocol or a
    device implementing a protocol behaves can be
    considered also 'DIRECTLY related to protocol
    development'.

-----Original Message-----
From: nahum@watson.ibm.com
Sent: Friday, May 26, 2006 2:51 PM
SPEC wants to develop and distribute common code
for benchmarking, as is done with SPECweb and
SPECjAppServer. That code can and should use the
standardized performance definitions agreed to by
SIPPING and/or BMWG.
12
BACKUP - Industry Collaboration
  • BMWG develops standard to benchmark SIP
    networking device performance
  • SIPPING WG develops standard to benchmark
    end-to-end SIP application performance
  • SPEC to develop industry-available test code for
    SIP benchmarking in accordance with the IETF's
    BMWG and SIPPING standards.

-----Original Message-----
From: Poretsky, Scott
Sent: Thursday, June 22, 2006 8:00 PM
To: 'Malas, Daryl'; acmorton@att.com;
gonzalo.camarillo@ericsson.com;
mary.barnes@nortel.com
Cc: vkg@lucent.com; Poretsky, Scott
Subject: RE: (BMWG/SippingWG) SIP performance
metrics

Yes Daryl. I absolutely agree. The item posted to
BMWG focuses on single DUT benchmarking of SIP
performance. Your work in SIPPING is focused on
end-to-end application benchmarking. It would be
great (and I would even say a requirement) that
the Terminologies for these two work items remain
consistent with each other.