TRANSCRIPT
1
Standardization of Field Performance Measurement Methods for Product Acceptance
Greg Twitty
R&D Project Manager, Product Test Factory
Nokia Mobile Phones
2
Overview
• Current state of product acceptance testing
• Proposal 1: Reduce high duplication of CDG3.
• Proposal 2: Standardize call performance methods.
• Summary
• Questions
3
Field Testing for Product Acceptance

Development timeline: Design Concept → R&D Testing → Prep for Lab Entry → Field Test Activities

Development Testing
• Test with internal methods.
• Verify compliance to standards.

Customer-Specific Testing
1. Pre-run CDG stage 3.
2. Demonstration in front of each carrier.

Product Acceptance (CDG stage 3)
• Significant portion of schedule.
• Approval cycle is significantly longer for CDMA than for other protocols.
• Impacts product cost.
• Reduces competitiveness of CDMA.
4
Challenges Facing CDG Stage 3
• Time-consuming and costly approval process.
  • Significant resource and travel costs.
• CDG64 test cases are loosely defined.
  • Each carrier has its own procedures.
• Not balanced for best error detection.
  • Each CDG3 duplicates tests with CDG2 and with other CDG3s.
  • Lack of standardized test methods and criteria impedes data sharing.
• Unbalanced test reliability in call performance tests.
  • Results are often not repeatable in weak signal areas.
  • False failures are often detected; analysis is very time-consuming.
  • Excessive testing in strong signal areas.

Goals: reduce Product Acceptance time and cost; test where the errors are most likely; improve reliability of results.
5
Standardization of CDG3 Test Methods
Proposal 1:
Reduce high test duplication of CDG stage 3.
• Phone software is very mature in many areas of feature test.
• Use data from the vendor, other CDG3s, and CDG2 to replace tests in “generic” areas.

Proposal 2:
Standardize call performance methods and criteria.
• Improve test accuracy in weak signal areas.
• Reduce unnecessary testing on strong signal routes.
6
Proposal 1:
Reduce High Test Duplication of CDG Stage 3
7
Return on Investment Over Time
• Infra deployments of a common vendor differ little between carriers.
• The number of errors found drops off significantly after the first CDG3.
• CDG3 test time required by carriers continues as the product matures.

[Chart: errors found vs. time, spanning R&D Testing, Pre-Lab Entry, and 1st, 2nd, and 3rd Carrier Approvals; total testing on Infra X keeps accumulating while errors found taper off.]
8
CDG Stage 3 Breakdown (CDG 64 Test Cases)

Call Performance (Mobile Tests)
• Terminations, Originations, Maintenance
• Strong Signal, Mixed Signal, and Weak Signal routes

Feature Testing (Static Tests)
• System Acquisition
• Call Types: POTS, 3-way, Call Waiting, Voice Mail, Authentication
• Provisioning: OTASP, OTAPA, IOTA
• 2G Data / HSPD
• Location Determination
• Other: SMS, MMS, Browser, Java, Brew

[Diagram annotations: 3-9 test cases / 70% of test effort / 30% of errors reported; 37 test cases / 30% of test effort / 20% of errors reported; 8 test cases / 30% of effort / 50% of errors reported. CDG 64 section references: 4.1; 4.2.1, 4.2.2, 4.3; 4.5, 4.9; 4.6, 4.7; 4.8; 4.10, 4.11.]
9
CDG Stage 3 Breakdown

Call Performance (Mobile Tests)
• Strong Signal routes (Terminations, Originations, Maintenance): tested in good signal conditions; high duplication.
• Mixed Signal and Weak Signal routes: tested in weak signal conditions; high variability.

Feature Testing (Static Tests): tested in good signal conditions; high duplication.
• System Acquisition
• Call Types: POTS, 3-way, Call Waiting, Voice Mail
• Provisioning: OTASP, OTAPA
• 2G Data / HSD
• Location Determination
• Other: SMS, MMS, Browser, Java, Brew

Interoperability w/ Infra
• Handoffs: SHO, Interband HHO, Interfreq HHO
• Messaging: Layer 1, Layer 2
• Channels: ACH, DPCH, FTCH, RTCH

RF Performance
• Rx: Sensitivity, IMD, Self-jamming
• Tx: Rho, Power control, Max Power
• BB & DSP: Signal Acquisition, SHO, Searcher, Finger assignment

These areas are typically tested in “generic” conditions; the tests have commonality with CDG2 tests and with other CDG3s.
10
Reduce High Test Duplication of CDG Stage 3

Proposal: Identify and utilize common test results.

1. Carriers and infra vendors should define clear CDG3 test procedures that apply to “generic” network conditions (common to all infra deployments):
   • Strong signal drive routes
   • 2G Data / HSD in static conditions
   • Messaging
   • Browser, content downloads, etc.
   • System Determination
   • Mobile IP / HSPD
2. Tests that are duplicated by CDG2 should also be identified:
   • Call Types
3. Test results from CDG2 and previous CDG3s should be considered during product acceptance.
   • Software changes made between test events should be evaluated.
4. Carrier and vendor continue to perform tests unique to the carrier's network:
   • Provisioning
   • Location Determination (non-CDG3)
11
Proposal 2:
Standardize call performance methods and criteria.
12
Issues with Call Performance Tests

• Weak Signal Routes: test results have high variation and are not reliable. Outcome is dependent on time of day.
  • Misleading results cause unnecessary searches for errors and retesting.
  • Significant pre-testing is needed by the vendor to measure lab entry readiness.
  • Reducing the variability of results will decrease acceptance time and cost.
• Strong Signal Routes: constitute mostly “generic” conditions.
  • Test variance is very low; results are repeatable.
  • Few errors are detected in this test.
  • Test length can be reduced for quicker acceptance time, or data can be accepted from other CDG3s using the same infra (see Proposal 1).

Carriers institute a wide range of test cases and criteria:
• Routes: Strong Signal, Mixed Signal, and Weak Signal; 1-12 cities; 2 infras.
• Long call: 1-2 hr; criteria range from 0.75-2 drops/hr to within 2% of a reference phone.
• Originations and Terminations: 100-200 attempts each; criteria range from 95% absolute to within 2% of a reference phone.
13
Mixture of Test Conditions

[Diagram: a drive route crossing Zone 1, Zone 2, a competitor or AMPS system, a coverage fringe area, a loaded sector, and an area of pilot pollution.]

• Zone boundary: a page can be missed while the registration timer is active.
• System Determination: too many variables; hard to distinguish these failures from other weak signal failures.
• Pilot Pollution: severity is affected by cell breathing.
• Loaded Sector: calls can be blocked during busy hour.

1. Test special conditions separately from call performance.
2. Measure/control non-deterministic factors where possible.
14
Example of Weak Signal Variability: Comparison of Two Reference Phones

[Chart: failed terminations per 100 attempts (0-14) for Reference Phone A and Reference Phone B over Days 1-5, with a line indicating the 2% passing criterion. Per-test delta between the phones: +1, 0, -7, -2, -1, +2, +7; average = 0.]

• Captured on a weak signal route; same make & model; same software.
• The reference phone is intended to track network conditions.
• For each individual test, there is no correlation between the two phones (see the simulation sketch below).
• Correlation is apparent only after 700 calls.
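This per-test scatter is exactly what independent sampling predicts. The following is a minimal simulation sketch, not from the deck: it assumes two statistically identical phones, each failing attempts independently with probability 0.094 (the weak-signal mean of 9.4 failures per 100 on the next slide), and prints the per-test deltas, which scatter around zero much like the +1, 0, -7, -2, -1, +2, +7 sequence above.

import random
import statistics

def failures(p=0.094, attempts=100):
    # Count failed calls in one test, assuming each attempt fails
    # independently with probability p (illustrative value only).
    return sum(random.random() < p for _ in range(attempts))

# Seven tests on each of two identical "reference phones".
phone_a = [failures() for _ in range(7)]
phone_b = [failures() for _ in range(7)]
deltas = [a - b for a, b in zip(phone_a, phone_b)]
print("deltas:", deltas, "average:", statistics.mean(deltas))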
15
Probability of Failure

• Creating a histogram from the previous slide's data, a probability distribution can be seen.
• The shape, mean (µ), and variance (σ²) of the distribution are determined by many factors:
  • RF conditions at test time
  • Test sample size (number of call attempts)
  • Phone performance

[Histogram using data from the previous slide: probability vs. failures after 100 attempts (0-20) on the weak signal drive route; µ = 9.4, σ² = 8.3.]
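As a side observation (not from the deck): if attempts failed independently with probability p = 0.094, a binomial model would predict a count variance of np(1-p) ≈ 100 × 0.094 × 0.906 ≈ 8.5, close to the σ² = 8.3 shown. The sketch below, with placeholder counts standing in for the chart's data, shows how µ, σ², and the histogram are computed from per-test failure counts.

import statistics
from collections import Counter

# Placeholder per-test failure counts; the real counts are in the
# deck's chart and are not reproduced here.
counts = [9, 10, 7, 12, 8, 11, 9, 13, 6, 10]

mu = statistics.mean(counts)
var = statistics.pvariance(counts)  # population variance, as on the slide
print(f"mu = {mu:.1f}, sigma^2 = {var:.1f}")

# Text histogram: estimated probability of each failure count.
hist = Counter(counts)
for k in sorted(hist):
    print(f"{k:3d} failures: p = {hist[k] / len(counts):.2f}")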
16
Call Performance Criteria

• Variance determines the effectiveness of the call performance test.
• For fixed criteria, a large variance has a higher chance of failing a good phone. Results are less reliable and repeatable.
• Hence, cell breathing and sample size affect test reliability.
• Observation: a sample of 100 call attempts has a large variance for weak signal routes (see the simulation sketch below).

[Diagram: distributions of failed calls after 100 attempts (0-20) for the test phone and the reference phone. With high variance, the tail beyond the 2% margin leaves a substantial chance of failing a good phone; with low variance, that chance is small.]
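To make the sample-size observation concrete, here is a minimal simulation sketch under assumed numbers (not part of the deck): both the test phone and the reference phone are good, with the same true failure probability of 9.4%, and we estimate how often the test phone would still exceed the reference phone's failure rate by more than a 2% margin at various sample sizes.

import random

def false_failure_rate(p=0.094, n=100, margin=0.02, trials=20000):
    # Estimate how often a good test phone exceeds a good reference
    # phone's failure rate by more than `margin`, when both share the
    # same true per-attempt failure probability p.
    fails = 0
    for _ in range(trials):
        test = sum(random.random() < p for _ in range(n))
        ref = sum(random.random() < p for _ in range(n))
        if test / n > ref / n + margin:
            fails += 1
    return fails / trials

for n in (100, 200, 400, 800):
    print(f"{n} attempts: ~{false_failure_rate(n=n):.0%} chance of false failure")

Larger samples shrink the variance of both rate estimates, so a fixed 2% margin becomes a more reliable discriminator.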
17
Standardize Call Performance

Proposal: Drive tests should be created with a target accuracy in mind. An agreed method to measure a test's repeatability is needed.

1. Drive routes should focus on a specific signal profile, i.e., weak or strong signal; no mixed signal routes. Other test cases should be executed in a separate test.
2. Standard guidelines for determining test accuracy and repeatability are needed. Methods should have a statistical basis.
3. Standard methods for determining sample size and pass criteria are needed.
   • This should include measuring network conditions at the time of test.

Adaptive Pass Criteria
• Create an index of signal conditions (Ec/Io & RSSI) vs. reference performance.
• Test criteria should then be defined for each index based on a standard.
• Ec/Io and traffic loading should be measured at test time.

EXAMPLE (a lookup sketch follows the table):

Ec/Io     RSSI      Sample size   Criteria
-11 dB    -70 dBm   50            95% abs.
-11 dB    -90 dBm   200           2% of ref.
-13 dB    -90 dBm   250           4% of ref.
-15 dB    -90 dBm   300           4% of ref.
…         …         …             …
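Below is a sketch of how such an index might be applied in test tooling. The helper is hypothetical, and the thresholds, sample sizes, and criteria are the slide's illustrative example values, not a standard.

def pass_criteria(ec_io_db, rssi_dbm):
    # Return (sample size, criterion) for measured signal conditions,
    # using the example index above. Rows run from strongest to weakest
    # conditions; the first row the measurement meets or exceeds applies.
    table = [
        # (min Ec/Io dB, min RSSI dBm, sample size, criterion)
        (-11, -70, 50,  "95% absolute"),
        (-11, -90, 200, "within 2% of reference phone"),
        (-13, -90, 250, "within 4% of reference phone"),
        (-15, -90, 300, "within 4% of reference phone"),
    ]
    for min_ecio, min_rssi, n, criterion in table:
        if ec_io_db >= min_ecio and rssi_dbm >= min_rssi:
            return n, criterion
    raise ValueError("signal conditions weaker than any indexed profile")

# Example: a weak-signal measurement.
print(pass_criteria(-12.5, -88))  # -> (250, 'within 4% of reference phone')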
18
Test Methods to Standardize

Common test methods are needed to ensure control over sources of variability.

• Different RF channels have different quality of service: traffic loading differs between channels, and BTS antennas, combiners, and RF equipment vary over frequency.
  • Carrier and vendor should ensure phones are provisioned to the same channels.
• Inconsistent phone handling in the test van affects antenna performance.
  • Use a phone cradle; rotate positions each lap.
• Call cycle time must be controlled: the phone must be in the idle state between call attempts.
  • All steps in the test should be defined and standardized.
• A laptop PC in the test van can induce interference.
  • Test the laptop in advance; rules are needed for proximity to test phones.
19
Summary
Proposal 1:
Reduce high test duplication of CDG stage 3.
• Significant areas of test duplication exist.
• Reduction in test time can be achieved by intelligent reuse of test results from previous CDG3s and CDG2.

Proposal 2:
Standardize call performance methods and criteria.
• More repeatable results can be achieved by controlling sources of variance and defining pass criteria.
• Similar analysis can be applied to strong signal tests to reduce test time.
• Standard test methods in the vehicle are needed to further reduce non-deterministic events.
Product Acceptance time and associated costs can be reduced by a unified effort of the CDMA community.
20
Questions