Transcript and Presenter's Notes

Title: WPP Study Group Tutorial


1
WPP Study Group Tutorial
  • Group created at the Vancouver meeting to develop
    test definitions and methods for 802.11
  • Presentations today will cover WPP from several
    points of view
  • Large user
  • IC vendor
  • Testing community

2
Agenda
  • Bob Mandeville, Iometrix
  • Don Berry, Microsoft
  • Mike Wilhoyte, Texas Instruments
  • Kevin Karcz, UNH-IOL
  • Jason A. Trachewsky, Broadcom
  • Fanny Mlinarsky, Azimuth Systems
  • Round Table Discussion and Q&A

3
802.11 Requirements for Testing Standards
  • Bob Mandeville
  • bob@iometrix.com

4
WPP
  • What is the need for 802.11 metrics?
  • What problems will they help solve?
  • Who will the primary users be?
  • How do we go about creating new metrics for
    wireless?

5
Two Approaches to Creating Testing Standards
  • IETF (BMWG)
  • Based on two-step approach to definitions
  • Terminology document (all relevant functional
    characteristics are defined)
  • Methodology document
  • This method is most appropriate for performance
    testing
  • ATM Forum, Metro Ethernet Forum
  • Based on ratified standards documents
  • Each test definition is referenced to standards
    source text
  • This method is most appropriate for conformance
    testing

6
IETF BMWG Test Standards Templates
  • Terminology Definition Template
  • Term: The term to be defined (e.g., Latency)
  • Definition: The specific definition for the term.
  • Discussion: A brief discussion about the term,
    its application and any restrictions on
    measurement procedures.
  • Measurement units: The units used to report
    measurements of this term, if applicable.
  • Issues: List of issues or conditions that affect
    this term.
  • See Also: List of other terms that are relevant
    to the discussion of this term.
  • Methodology Definition Template
  • Objectives
  • Setup parameters
  • Procedures
  • Measurements
  • Reporting formats
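To make the Terminology Definition Template above concrete, here is a minimal sketch (an addition for illustration, in Python; the field names mirror the template bullets, and the example values are placeholders):

    # Sketch only: the BMWG terminology template captured as a dataclass.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TermDefinition:
        term: str                # term to be defined, e.g. "Latency"
        definition: str          # the specific definition for the term
        discussion: str          # application notes and measurement restrictions
        measurement_units: str   # units used to report measurements, if applicable
        issues: List[str] = field(default_factory=list)    # conditions affecting the term
        see_also: List[str] = field(default_factory=list)  # related terms

    example = TermDefinition(
        term="Latency",
        definition="<definition text from the terminology document>",
        discussion="<application notes and measurement restrictions>",
        measurement_units="milliseconds",
    )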

7
Conformance-Oriented Test Methods Template 1/2
Test Name: Name derived from the reference document
Test Definition ID: A punctuated alphanumeric string assigned to each defined requirement and test procedure couple, using the following convention: a one-to-three-letter abbreviated source document name, "." the section number, "-" the paragraph number in the section from which the requirement is derived. The paragraph number always figures as the last number of an ID. Document abbreviations: Ethernet Services Model = M; Ethernet Services Definitions = S; Traffic and Performance Parameters for SLSs = T. Example: M.6.1-4 (a parsing sketch follows this template)
Reference Document: Source reference document and section (and paragraph when useful for clarity)
Test Type: Functional, Conformance, Interoperability or Performance
Test Status: Normative, optional or additional
Requirement Description: Brief description of the service requirement that the device must or should satisfy
Description of DUT/SUT: Type of Ethernet frame forwarding Device Under Test (DUT). Common designations used in this document are CE Device (Customer Equipment Device), UNI Device (User Network Interface Device) and MEN Device (Metro Ethernet Network Device). A UNI device may be considered a kind of MEN border device.
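The ID convention above is mechanical enough to check in code. A minimal sketch (an addition for illustration, not part of the template) that parses and validates IDs such as M.6.1-4:

    # Sketch only: parse <doc>.<section>-<paragraph> test definition IDs.
    import re

    ID_PATTERN = re.compile(r"^([A-Z]{1,3})\.(\d+(?:\.\d+)*)-(\d+)$")

    DOC_ABBREVIATIONS = {
        "M": "Ethernet Services Model",
        "S": "Ethernet Services Definitions",
        "T": "Traffic and Performance Parameters for SLSs",
    }

    def parse_test_id(test_id):
        """Split a test definition ID into (document, section, paragraph)."""
        match = ID_PATTERN.match(test_id)
        if not match:
            raise ValueError("malformed test definition ID: %r" % test_id)
        doc, section, paragraph = match.groups()
        return DOC_ABBREVIATIONS.get(doc, doc), section, int(paragraph)

    print(parse_test_id("M.6.1-4"))
    # -> ('Ethernet Services Model', '6.1', 4)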
8
Conformance-Oriented Test Methods Template 2/2
Test Object: Succinct description of the test purpose
Test Bed Configuration: Succinct description of the test bed configuration
Test Procedure: Succinct description of the test procedure, with mention of the test stimulus and expected output
Units: Units can be time units, rates and counts in integers, such as milliseconds, frames per second and numbers of frames transmitted or received. For the most part the units used are defined in RFCs 2285, 2544 and 2889
Variables: Variables such as frame length, test duration and number of interfaces under test must be described
Results: Description of the textual, numerical and/or graphical format in which to display test results
Remarks: Description of any particular observations that might affect the test result
9
Sample List of 802.11 Terms to be Defined by
Category

10
Requirements for Testing Standards
  • Roaming Test
  • A practical example which shows poor performance
    related to a lack of test standards definitions
  • To roam, an 802.11a/b/g device will
  • disassociate from one AP
  • search for a stronger RF signal from another AP
  • then associate and authenticate with that AP
  • resume normal data transmission
  • Roaming can fail due to
  • transient RF conditions
  • the time that APs and devices take to complete
    the four step roaming process

11
Test Configuration
12
Test Procedure
  • The Roaming Test recorded
  • Total Roaming Time = Decision Time + Switch Over
    Time (a computational sketch follows this list)
  • The Decision Time is the time it took the NIC to
    stop attempting to transmit packets to AP 1 after
    the attenuation of the RF signal
  • The Switch Over Time is the time it took the NIC
    to complete its association with AP 2 after it
    stopped attempting to transmit packets to AP 1
  • During the Decision Time, cards recognized the
    signal attenuation and invoked proprietary
    algorithms to adapt their rates to slower speeds.
  • Switch Over Time ends when the NIC receives the
    AP's acknowledgement to its association request.
  • This time should only be recorded as valid if
    data traffic from the NIC to the AP successfully
    resumes transmission after association.
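A minimal computational sketch of the decomposition above (an addition for illustration; it assumes the three timestamps have already been extracted from a packet trace):

    # Sketch only: roaming-time decomposition from trace timestamps (seconds).
    def roaming_times(t_attenuation_start, t_last_tx_to_ap1,
                      t_assoc_ack_from_ap2, data_resumed):
        """Return (decision_time, switch_over_time, total_roaming_time).

        decision_time:    RF attenuation applied -> NIC stops transmitting to AP 1
        switch_over_time: NIC stops transmitting to AP 1 -> AP 2 acks association
        """
        if not data_resumed:
            raise ValueError("invalid run: data traffic did not resume")
        decision_time = t_last_tx_to_ap1 - t_attenuation_start
        switch_over_time = t_assoc_ack_from_ap2 - t_last_tx_to_ap1
        return decision_time, switch_over_time, decision_time + switch_over_time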

13
Test Results
14
(No Transcript)
15
Test Conclusions
  • Need to break out Decision Time and Switch Over
    Time
  • Switch Over Times are as low as 60 milliseconds
    and average a little over 700 milliseconds across
    all the combinations, excepting two outliers which
    took over 8 seconds.
  • In the majority of cases, Decision Time is the
    largest contributor to the overall Roaming Time.
  • Packet traces show that most implementations of
    the rate adaptation algorithms maintain
    transmission at the lowest 1 Mbps rate for
    several seconds after loss of RF signal has been
    detected.
  • These algorithms will need to be revisited to
    deliver quality roaming.
  • Test standards for measuring roaming times can
    make a significant contribution by aligning all
    vendors and users on a common set of definitions
  • This applies not only to roaming but also to a
    large number of other undefined terms

16
Challenges of Operating an Enterprise WLAN
  • Don Berry
  • Senior Wireless Network Engineer
  • Microsoft Information Technology

17
Microsoft's Current WLAN Network
  • 4,495 Access Points
  • 1 AP per 3,500 sq ft
  • 15 million sq ft covered in 79 countries
  • 70,000 users
  • 500,000 802.1X authentications (EAP-TLS) per day
  • Supports 5.5 and 11 Mbps only

18
Wireless Service Challenge
  • What is Wireless Service?
  • How is it measured?
  • What factors impact Wireless Service?
  • How do you improve Wireless Service?

19
Wireless Service and Support
  • Service Goals
  • Make Wireless service equivalent to wired
  • Offer unique mobile services
  • Operational Goals
  • Reduce operational costs
  • Minimize support personnel

20
How can WPP Help?
  • Produce criteria that reflect the client
    experience
  • Offer information that can compare different
    environments: enterprise, SOHO, home

21
Desired Outcome of WPP: A Perspective From a Si
Provider
  • Mike Wilhoyte
  • Texas Instruments

22
Key Issues We Face Relevant to WPP
  • Supporting Customers w/ Custom Test Fixtures
  • These fixtures are often uncontrolled, so
    repeatability is questionable
  • May introduce unintentional impairments and
    therefore don't effectively isolate the feature
    under test
  • May unintentionally push beyond the boundary of
    the specification
  • May stress the system beyond what any user or
    evaluator will do
  • May overlook other critical areas of system
    performance
  • The complexity of the specification has grown
    since the days of 802.11b, and more than ever
    performance is setup-dependent
  • Are tests really apples-to-apples?

23
These Issues Result in
  • Confusion over unexpected test results
  • Resource drain

24
A Real Example: Customer ACI Test Fixture
[Diagram: an RF shield room (non-anechoic) containing
over 30 active APs and people observing the results,
carrying TCP/IP traffic. Test: plot TCP/IP throughput
with increasing levels of interference from the SMIQ
signal generator.]
25
Issues With This ACI Test Fixture
Can you imagine trying to repeat any test result
from this fixture in YOUR lab?
  • Metal walls in the shield room produce multipath,
    making the test results depend even more on the
    position of the laptop (in a fade?)
  • People (2.4 GHz RF absorbers) in the room
  • Over 30 APs active, which may couple into the RF
    front-end of the test AP (even though it's cabled)
  • SMIQ produces a non-realistic signal since the
    carrier is always present even though it may be
    triggered externally
  • There are ways around this
  • The test AP is not isolated from the interference
    and its behavior will affect the test result of
    the DUT
  • Rx performance in the same interference
  • Deferral behavior in the Tx (CCA) is affected
  • Rate adjustment behavior

26
A Better ACI Test Fixture: Testing the STA
[Block diagram: the DUT STA and its AP operate on
channel 6 in anechoic chambers 3 and 4, connected
through couplers, 20/30 dB pads and an attenuator.
An interfering AP/STA network in anechoic chambers 1
and 2 is swept on channels 1-11. A PA isolates the
interfering network so it is not affected by traffic
in chambers 3 and 4.]
27
A Better ACI Test Fixture: Testing a Network (AP+STA)
[Block diagram: the DUT AP+STA network operates on
channel 6 in anechoic chamber 3, connected through
20/30 dB pads and an attenuator. An interfering
AP/STA network in anechoic chambers 1 and 2 is swept
on channels 1-11. A PA isolates the interfering
network so it is not affected by traffic in
chamber 3.]
28
Desirable Outcomes of WPP
  • Develop a minimal set of test metrics that are
    relevant to key performance parameters such as
  • Robustness
  • Throughput/Capacity
  • Range
  • Develop a Set of Test Best Practices that
  • Produce repeatable results
  • Achieve the right balance between complexity and
    cost
  • The industry will use

29
UNH-IOL Perspective on WLAN Performance Testing
  • Kevin Karcz
  • March 15, 2004

30
A Quick Background
  • UNH-IOL Wireless Consortium primarily has focused
    on interoperability and conformance tests for
    802.11, not performance testing
  • Have generated traffic loads to observe a DUT's
    behavior under stress, but not specifically to
    measure throughput or related parameters.
  • However, QoS is inherently about providing
    performance while sharing limited resources
  • Optimization of Throughput, Range, Delay, Jitter
  • Constrained by
  • User resources: CPU load, DC power
  • Community resources: available spectrum,
    aggregate users

31
Examples of performance tests
  • PHY layer
  • Multi-path fading channel emulation using a
    Spirent SR5500 fader.
  • What fading models should be used?
  • MAC layer
  • Throughput varies with Traffic Generator used
  • CPU load differs significantly between vendors,
    and is much greater than the CPU load for a
    typical Ethernet device.

32
Clear methods of testing are needed
  • As we start measuring more performance metrics
  • Can each layer of the network be measured
    independently?
  • Which metrics need to look at the interaction of
    multiple layers?
  • Hassle of real world scenario testing vs. a PHY
    test mode?
  • Black box testing requires DUT to authenticate
    and associate with test equipment and interact at
    the MAC layer, not just the PHY layer.

33
Some gray areas of testing
  • Throughput
  • Is throughput measured as MAC-layer payload? At
    the IP layer? At the TCP or UDP layer? (the
    sketch after this list shows how the answer
    changes the number)
  • One DUT may have better PER measurements at the
    PHY layer than a 2nd DUT, but may get worse
    throughput if its rate selection algorithm is
    poor.
  • Difficult to maintain consistency in an open
    (uncontrolled) environment
  • Can throughput be measured in a cabled
    environment without an antenna?
  • What if the DUT has a phased array antenna?
  • What if the device is mini-PCI and inherently has
    no antenna?
  • Range test
  • What if a higher TX level causes higher adjacent
    channel interference and brings the aggregate
    throughput down for a neighboring BSS?
  • Power consumption
  • Is this just the DC power drain at the CardBus
    card interface?
  • Should CPU load be included if the DUT implements
    much of its MAC functionality on a host PC?
  • Roaming
  • Quickest time: 1 STA, 2 APs on the same channel
  • More realistic: AP reboots, multiple STAs roam to
    a new AP on a new channel
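To illustrate the first gray area, a minimal sketch (an addition for illustration; the header sizes are nominal assumptions: LLC/SNAP 8 bytes, IPv4 20 bytes, UDP 8 bytes) of how the same frame stream yields different "throughput" figures depending on the measurement layer:

    # Sketch only: throughput as seen at the MAC, IP and UDP layers.
    LLC_SNAP, IPV4_HDR, UDP_HDR = 8, 20, 8

    def throughputs_mbps(frames_per_sec, mac_payload_bytes):
        def mbps(payload_bytes):
            return frames_per_sec * payload_bytes * 8 / 1e6
        return {
            "MAC payload": mbps(mac_payload_bytes),
            "IP packet":   mbps(mac_payload_bytes - LLC_SNAP),
            "UDP payload": mbps(mac_payload_bytes - LLC_SNAP - IPV4_HDR - UDP_HDR),
        }

    print(throughputs_mbps(5000, 1508))
    # -> {'MAC payload': 60.32, 'IP packet': 60.0, 'UDP payload': 58.88}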

34
Why WPP's role is important to UNH-IOL
  • Open standards are desired for the basis of test
    suite development
  • Defining test parameters and standardizing test
    scenarios makes apples-to-apples comparison
    easier
  • IEEE focuses on the technical content
  • Our interest is the testing, not determining how
    results are utilized

35
Why WPP should define the tests
  • UNH-IOL follows IEEE PICS for test cases
  • More detailed info for test results
  • Use cases: PDA, laptop and AP weight test results
    differently

36
Example criteria weighting
[Table: the criteria Throughput, Delay, Jitter, DC
power and Roaming time are weighted per device class
(Laptop, AP, VoIP phone). Most cell values were not
captured in this transcript; the AP row shows N/A for
two criteria and the VoIP phone row shows a 0.]
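A minimal sketch (an addition for illustration; all weights are assumptions, since the actual cell values were not captured in the transcript) of how per-device-class weighting of test results could work:

    # Sketch only: illustrative criteria weights per device class.
    WEIGHTS = {
        # metric:      (laptop, AP,   VoIP phone)
        "throughput":   (0.40,  0.50, 0.15),
        "delay":        (0.10,  0.20, 0.25),
        "jitter":       (0.10,  0.30, 0.25),
        "dc_power":     (0.20,  0.00, 0.15),  # N/A for a mains-powered AP
        "roaming_time": (0.20,  0.00, 0.20),  # N/A for a fixed AP
    }
    COLUMN = {"laptop": 0, "ap": 1, "voip_phone": 2}

    def weighted_score(normalized_results, device_class):
        """Weighted sum of results, each metric pre-scaled to [0, 1]."""
        col = COLUMN[device_class]
        return sum(WEIGHTS[metric][col] * normalized_results[metric]
                   for metric in WEIGHTS)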
37
Comments on Wireless LAN Performance Testing And
Prediction
  • Jason A. Trachewsky
  • Broadcom Corporation
  • jat@broadcom.com

38
Topics
  • Test Categories for WPP
  • Some Test Configurations

39
Test Categories for WPP
  • Deciding what parameters are to be considered is
    the challenge.
  • How do we transform user perception of
    performance into a set of repeatably-measurable
    quantities?
  • Throughput and Range (what environments?)
  • Frame latency
  • Visibility of APs

40
Test Categories for WPP
  • How do we transform user perception of
    performance into a set of repeatably-measurable
    quantities?
  • Delays in association/authentication
  • Host CPU utilization
  • Ability to roam without loss of connections
  • Etc.

41
Test Categories for WPP
  • Basic PHY/RF Measurements
  • Transmitter Parameter Measurements
  • TX EVM or Frame Error Rate (FER) with
    Golden/Calibrated Receiver
  • Carrier suppression
  • Carrier frequency settling

42
Test Categories for WPP
  • Receiver Parameter Measurements
  • RX FER vs. input power
  • Flat channel (controlled through cabled/shielded
    environment)
  • Controlled frequency-selective channels (known
    multipath power-delay profile)
  • Antenna measurements
  • cable/feed losses (S11 and S21)
  • gain vs. azimuth and elevation angle
  • One can easily take a great receiver design and
    blow away all its gains with a bad antenna or
    lossy feed! (see the worked example after this
    list)
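A small worked example of that last point (an addition for illustration; all numbers are assumptions): system-level sensitivity combines receiver sensitivity, feed loss and antenna gain, so a lossy feed can erase a great receiver design.

    # Sketch only: effective system sensitivity in dBm.
    def system_sensitivity_dbm(rx_sensitivity_dbm, feed_loss_db, antenna_gain_dbi):
        return rx_sensitivity_dbm + feed_loss_db - antenna_gain_dbi

    great_rx_lossy_feed = system_sensitivity_dbm(-95.0, 6.0, 2.0)   # -91 dBm
    decent_rx_clean_feed = system_sensitivity_dbm(-92.0, 1.0, 2.0)  # -93 dBm
    # The "worse" receiver with the clean feed hears 2 dB deeper.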

43
Test Categories for WPP
  • MAC Layer Measurements
  • rate adjustment behavior
  • specific parameters? test conditions?
  • association and roaming behavior
  • specific parameters? test conditions?
  • frame latency
  • layer-2 throughput with encryption
  • host CPU cycles consumed?

44
Test Categories for WPP
  • Layer-4 Measurements (a sweep sketch follows this
    list)
  • UDP frame loss rate and latency vs. received
    power
  • flat or frequency-selective channels?
  • TCP throughput vs. received power
  • flat or frequency-selective channels?
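A minimal sketch of such a sweep (an addition for illustration; attenuator and run_udp_trial are hypothetical stand-ins for lab-specific drivers):

    # Sketch only: UDP frame loss vs. stepped attenuation.
    def sweep_udp_loss(attenuator, run_udp_trial,
                       start_db=0, stop_db=60, step_db=2):
        """Return a list of (attenuation_dB, udp_frame_loss_ratio) points."""
        points = []
        for atten_db in range(start_db, stop_db + 1, step_db):
            attenuator.set_attenuation(atten_db)  # hypothetical driver call
            sent, received = run_udp_trial()      # e.g. a fixed-rate UDP stream
            points.append((atten_db, 1.0 - received / sent))
        return points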

45
Test Categories for WPP
  • Open-air Measurements
  • Open-air measurements are always subject to
    imperfectly-known time-varying multipath
    power-delay profiles.
  • There is substantial variation at 2.4 and 5.5 GHz
    over tens of milliseconds.

46
Test Categories for WPP
  • We have established that frequency-selectivity
    due to multipath can result in higher-loss
    channels having higher capacity than lower-loss
    channels.
  • The capacity of the channel can vary rapidly.
  • This is a more significant problem for systems
    like 802.11a and 802.11g which include a large
    number of possible rates to better approach the
    practical capacity.
  • (The problem won't get easier for future WLAN
    standards; a toy capacity illustration follows.)
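A toy numerical illustration of this variability (an addition, using idealized per-subchannel Shannon capacity and made-up numbers): two channels with the same total received power can differ widely in capacity, so loss alone does not predict throughput.

    # Sketch only: capacity of flat vs. frequency-selective channels.
    import math

    def capacity_mbps(subchannel_bw_mhz, linear_snrs):
        """Sum of per-subchannel Shannon capacities, B * log2(1 + SNR)."""
        return sum(subchannel_bw_mhz * math.log2(1 + snr) for snr in linear_snrs)

    flat = capacity_mbps(5.0, [10.0, 10.0, 10.0, 10.0])    # ~69.2 Mbps
    selective = capacity_mbps(5.0, [39.4, 0.3, 0.2, 0.1])  # ~30.6 Mbps
    # Same total SNR budget (40) over four subchannels, very different capacity.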

47
Test Categories for WPP
  • Open-air Measurements
  • What can we do?
  • Try to perform as many measurements as possible
    over cabled networks.
  • Perform open-air measurements in an area in which
    the distance between transmitter and receiver is
    small compared with the distance between either
    transmitter or receiver and any other object.
    I.e., strong LOS.
  • Helpful but not sufficient, as even small
    reflections affect channel capacity.

48
Test Categories for WPP
  • Open-air Measurements
  • What can we do?
  • Give up and perform a large ensemble of
    measurements and gather statistics (a sketch
    follows this slide).
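A minimal sketch of the ensemble approach (an addition for illustration; measure_throughput_mbps is a hypothetical stand-in for one open-air trial):

    # Sketch only: report distribution statistics, not a single number.
    import statistics

    def ensemble_stats(measure_throughput_mbps, trials=100):
        samples = sorted(measure_throughput_mbps() for _ in range(trials))
        return {
            "mean": statistics.mean(samples),
            "stdev": statistics.stdev(samples),
            "p10": samples[int(0.10 * (trials - 1))],
            "median": statistics.median(samples),
            "p90": samples[int(0.90 * (trials - 1))],
        }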

49
Channel Measurement Block Diagram
  • Scope provides a 10 MHz reference clock for all
    systems
  • 3 long interconnect cables connect the TX and RX
    sides
  • Filter module includes an LNA

50
Time- and Frequency-Selective Fading
> 10 dB change in received signal power in some
bins over 60 msec.
51
Topics
  • Test Categories for WPP
  • Some Test Configurations

52
System Block Diagram
  • Main boardtest system
  • Transmit test system

53
Boardtest System
[Block diagram: controller (attenmach), DUT,
attenuator, REF, couplers, mux-2/mux-3/mux-6,
spectrum analyzer (SA) and power meter.]
54
RF Loop Block Diagram
Does WPP specify RF test fixtures?
Does WPP specify fading channel emulators (no!)
or a set of fading channel profiles (maybe!)?
55
Multipath Channel 2
Example fixed multipath channel power-delay
profile.
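A minimal sketch (an addition for illustration; the tap values are made up, not the profile on the slide) of how a fixed power-delay profile reduces to the usual summary statistic, RMS delay spread:

    # Sketch only: RMS delay spread from a power-delay profile.
    import math

    PROFILE = [(0, 0.0), (50, -3.0), (110, -10.0), (170, -18.0)]  # (ns, dB)

    def rms_delay_spread_ns(profile):
        powers = [10 ** (p_db / 10.0) for _, p_db in profile]  # dB -> linear
        total = sum(powers)
        mean_delay = sum(t * p for (t, _), p in zip(profile, powers)) / total
        second_moment = sum(t * t * p for (t, _), p in zip(profile, powers)) / total
        return math.sqrt(second_moment - mean_delay ** 2)

    print("RMS delay spread: %.1f ns" % rms_delay_spread_ns(PROFILE))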
56
Comments on Wireless LAN Performance Testing And
Prediction
  • Fanny Mlinarsky
  • Azimuth Systems
  • fanny_mlinarsky@azimuth.net

57
Ethernet vs. WiFi
Greater protocol complexity means more test metrics.
Wired test metrics: RFCs 2285, 2544, 2889
Wireless test metrics: not yet defined
58
Test Metrics
  • Packet forwarding
  • Security
  • Roaming
  • Behavioral
  • QoS
  • Rate adaptation
  • Encryption: WEP, TKIP, AES
  • Authentication: EAP-TLS, EAP-TTLS, PEAP, LEAP
  • Test variables: Offered Load, Packet Size, # of
    Clients, # of Power Save Clients, Load of
    Assoc/De-assoc/Re-assoc, RTS/CTS, Fragmentation
59
Forwarding Rate Measurement
  • Open air
  • Controlled RF
60
Controlled Test Environment
  • If measurements are not repeatable, the test is
    invalid
  • Open air creates unpredictable test conditions
    due to interference and multipath
  • Shielded and cabled test environment may be
    necessary for some measurements

61
Summary
  • The IT manager's question: How well do mobile
    computing solutions perform in the enterprise?
  • The dilemma: Standard wired benchmarking
    techniques won't give the right answer
  • Verifying function and performance in the
    inherently unstable wireless space calls for new
    methods and metrics
  • The answer: New methods to test and measure every
    aspect of wireless protocols
  • Wireless metrics outnumber traditional Ethernet
    metrics 5:1