Competitive Network Benchmarking Sample Report
Albuquerque, NM
Issued: August 2005
Competitive Network Benchmarking Sample Report - Albuquerque
2005 EnVision Wireless, Inc. - i - 8/26/2005 All Rights Reserved
Table of Contents

Executive Summary ............................................. 1
Network Comparison Results .................................... 3
Client Network Results ........................................ 6
Test Parameters ............................................... 9
Benchmarking Methodology ..................................... 10
    Call Processing .......................................... 11
    Network Ranking .......................................... 12
Quality Assurance ............................................ 13
    Project Planning ......................................... 13
    In-Market Setup .......................................... 14
    Drive Test Clearance ..................................... 14
    Data Collection .......................................... 15
    Data Post Processing ..................................... 16
    Report Review ............................................ 17
    Report Production ........................................ 17
    Shipment and Data Archival ............................... 17
Appendix ..................................................... 18
Index of Tables

Table 1: Network Call Performance Results ..................... 1
Table 2: Summary Data ......................................... 3
Table 3: Received Signal Strength Cumulative Distribution ..... 3
Table 4: CDMA FER Cumulative Distribution ..................... 3
Table 5: Client Dropped Call Analysis ......................... 8
Table 6: Test Parameters ...................................... 9
Index of Figures
Figure 1: Composite Network Rankings ..................................... 2
Figure 2: System Access Times (seconds)................................... 4
Figure 3: Received Signal Strength (dBm) .................................. 4
Figure 4: CDMA FER (%)........................................................ 4
Figure 5: GSM RxQual.......................................................... 5
Figure 6: iDEN SQE (dB) ....................................................... 5
Figure 7: CDMA Aggregate Ec/Io (dB) ....................................... 5
Figure 8: Client Received Signal Strength (dBm) .......................... 6
Figure 9: Client Mobile Transmit Power (dB)............................... 6
Figure 10: Client Forward FER (%) ........................................... 7
Figure 11: Client Aggregate Ec/Io (dB) ..................................... 7
Figure 12: Client Active Set PNs ............................................. 7
Figure 13: System Setup Photograph One .................................10
Figure 14: System Setup Photograph Two .................................10
Figure 15: Agilent E6474A Call Processing Flow ..........................11
Executive Summary
EnVision Wireless was contracted by Client to conduct a competitive analysis of commercial wireless network performance in the Albuquerque market in August 2005. This report represents a sample of the results for that survey.

EnVision uses the Agilent E6474A measurement system in the execution of these benchmarking surveys. This equipment allows for an accurate, repeatable network evaluation from a perspective that duplicates the discriminating opinion of the subscriber. Results collected in this manner are independent of technology and are useful for comparison at the local, regional and national levels.

Carrier network performance is ranked based on accessibility (no-service and blocked calls) and retainability (dropped calls). In addition to call performance, EnVision also measures and compares the following engineering metrics by operator:

• System access time
• Received signal strength
• Signal quality (FER, RxQual or SQE)
• Aggregate Ec/Io

The following table summarizes the overall network call performance for the Albuquerque survey. For detailed results, please refer to the “Network Comparison Results” section of this report.
Table 1: Network Call Performance Results

Carrier    Access    Accessibility  No Service  Blocked  Accessibility  Retainability  Dropped  Retainability  Composite  Composite
           Attempts  (%)            Calls       Calls    Rank           (%)            Calls    Rank           (%)        Rank
Nextel     523       99.43%         0           3        1              99.62%         2        1              99.52%     1
Cingular   498       95.58%         0           22       6              99.58%         2        2              97.58%     6
T-Mobile   468       98.29%         3           5        2              98.91%         5        4              98.60%     2
Cricket    493       97.97%         1           9        3              98.34%         8        5              98.16%     4
Sprint     485       97.11%         2           12       5              99.58%         2        3              98.34%     3
Verizon    519       97.50%         2           11       4              97.83%         11       6              97.66%     5

Utilizing the network ranking methodology outlined later in this report, the networks are ranked as follows:
Figure 1: Composite Network Rankings
[Chart: horizontal bar chart of Accessibility (%), Retainability (%) and Composite (%) for Cingular Wireless GSM, Verizon CDMA, Cricket CDMA, Sprint CDMA, T-Mobile GSM and Nextel iDEN; values as listed in Table 1.]
Based on the two (2) comparative measurement criteria of accessibility (no service and blocked call incidents) and retainability (dropped calls), the rankings show Nextel as the best carrier for this study.
Network Comparison Results
This section contains the overall high-level findings of the survey conducted for the Albuquerque market.
Table 2: Summary Data

Parameter          Nextel   Cingular  T-Mobile  Cricket  Sprint   Verizon
Call Attempts      523      498       468       493      485      519
Accessibility (%)  99.43%   95.58%    98.29%    97.97%   97.11%   97.50%
Retainability (%)  99.62%   99.58%    98.91%    98.34%   99.58%   97.83%
Composite (%)      99.52%   97.58%    98.60%    98.16%   98.34%   97.66%
Call Setup (sec)   0.14     4.99      4.75      2.40     1.99     2.00
RSL (dBm)          -67.47   -77.90    -75.37    -73.06   -69.73   -74.30
FER (%)            -        -         -         0.30     1.35     0.46
Agg. Ec/Io (dB)    -        -         -         -7.15    -7.01    -7.45
RxQual             -        1.31      1.72      -        -        -
SQE (dB)           30.99    -         -         -        -        -
Table 3: Received Signal Strength Cumulative Distribution

RSL (dBm)  Nextel   Cingular  T-Mobile  Cricket  Sprint  Verizon
< -104     0.0%     0.1%      0.0%      9.0%     0.2%    0.5%
< -96      0.1%     3.0%      0.7%      21.0%    0.4%    0.7%
< -86      4.2%     24.6%     22.2%     37.1%    4.8%    8.0%
< -75      26.9%    70.4%     62.7%     65.0%    32.3%   50.8%
< -30      100.0%   100.0%    99.9%     99.7%    97.0%   99.0%
Table 4: CDMA FER Cumulative Distribution

FER (%)  Cricket  Sprint   Verizon
< 1      91.8%    39.7%    87.4%
< 2      97.0%    73.9%    94.6%
< 3      98.1%    97.3%    96.7%
< 100    100.0%   100.0%   100.0%
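Cumulative distributions of the kind shown in Tables 3 and 4 can be tabulated directly from per-sample measurements. The sketch below is illustrative only (the sample values are invented for demonstration and are not survey data):

```python
# Illustrative sketch: build a "< threshold" cumulative distribution,
# as in Tables 3 and 4, from a list of per-sample measurements.
# The FER samples below are made up for demonstration.

def cumulative_distribution(samples, thresholds):
    """Percentage of samples strictly below each threshold."""
    n = len(samples)
    return {t: round(100 * sum(1 for s in samples if s < t) / n, 1)
            for t in thresholds}

fer_samples = [0.2, 0.4, 0.9, 1.1, 1.8, 2.5, 3.4, 0.0, 0.6, 4.2]
print(cumulative_distribution(fer_samples, [1, 2, 3, 100]))
# -> {1: 50.0, 2: 70.0, 3: 80.0, 100: 100.0}
```

Because the thresholds are cumulative, the final bin (< 100 for FER) always reaches 100% of the valid samples.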
Figure 2: System Access Times (seconds)
[Chart: “Call Setup Time Comparison for Albuquerque”. Cumulative percentage versus time in seconds (0 to 20 s) for Cricket CDMA, Sprint CDMA, Verizon CDMA, Cingular Wireless GSM, T-Mobile GSM and Nextel iDEN.]
Figure 3: Received Signal Strength (dBm)
[Chart: “RSL Comparison for Albuquerque”. Percentage versus Rx power (0 to -120 dBm) for all six networks.]
Figure 4: CDMA FER (%)
[Chart: “CDMA FER Comparison for Albuquerque”. Percentage versus FER (0 to 10%). Cricket: mean 0.30, std. dev. 1.06. Sprint: mean 1.35, std. dev. 1.47. Verizon: mean 0.46, std. dev. 1.34.]
Figure 5: GSM RxQual
[Chart: “GSM RxQual Comparison for Albuquerque”. Percentage versus RxQual (0 to 7). Cingular: mean 1.31, std. dev. 2.04. T-Mobile: mean 1.72, std. dev. 2.30.]
Figure 6: iDEN SQE (dB)
[Chart: “iDEN Svr SQE for Albuquerque”. Percentage versus server SQE (0 to 40 dB). Nextel: mean 30.99, std. dev. 3.23.]
Figure 7: CDMA Aggregate Ec/Io (dB)
[Chart: “CDMA Aggregate Ec/Io Comparison for Albuquerque”. Percentage versus aggregate Ec/Io (0 to -25 dB). Cricket: mean -7.15, std. dev. 3.58. Sprint: mean -7.01, std. dev. 3.81. Verizon: mean -7.45, std. dev. 3.88.]
Client Network Results
This section provides a more detailed analysis of the Client’s network performance in Albuquerque.
Figure 8: Client Received Signal Strength (dBm)

RSL bin (dBm)   -105    -95     -85     -75     -65     -55     -45    -30
Frequency         15    119    1326    4167    4242    2016     596     78
Cumulative %   0.12%  1.06%  11.59%  44.67%  78.34%  94.34%  99.07% 99.69%
Figure 9: Client Mobile Transmit Power (dBm)

Tx power bin (dBm)    -25     -15       0      15     25
Frequency            1049    1755    7268    2217    283
Cumulative %        8.33%  22.26%  79.95%  97.55% 99.79%
Figure 10: Client Forward FER (%)

FER bin (%)        0     0.5       1     1.5       2     2.5      3
Frequency       8644    1934     987     374     286      84     50
Cumulative %  68.61%  83.97%  91.80%  94.77%  97.04%  97.71% 98.10%
Figure 11: Client Aggregate Ec/Io (dB)

Ec/Io bin (dB)    -10      -9      -8      -7      -6      -5      -4     -3
Frequency        1965     671     849    1421    1803    2390    1938   1182
Cumulative %   15.60%  20.92%  27.66%  38.94%  53.25%  72.23%  87.61% 96.99%
Figure 12: Client Active Set PNs

Active PNs          1       2       3        4
Frequency        6575    3656    1897      470
Cumulative %   52.19%  81.21%  96.27%  100.00%
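The Cumulative % rows in Figures 8 through 12 are running sums of the Frequency rows, expressed as a fraction of the total sample count. A minimal sketch, checked against the Figure 12 (active set PN) counts:

```python
# Sketch: derive a histogram's Cumulative % row from its Frequency
# row, as in Figures 8-12. Checked against the Figure 12 counts.

def cumulative_percent(frequencies):
    """Running sum of frequencies as percentages of the total."""
    total = sum(frequencies)
    running = 0
    out = []
    for f in frequencies:
        running += f
        out.append(round(100 * running / total, 2))
    return out

print(cumulative_percent([6575, 3656, 1897, 470]))
# -> [52.19, 81.21, 96.27, 100.0]
```

(Figures 8 through 11 end below 100% because a small number of samples fall outside the charted bins.)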
Table 5: Client Dropped Call Analysis

Latitude  Longitude  Time            RX Power  TX Power  Fwd FER  Current  Strongest  Strongest
                                     (dBm)     (dBm)     (%)      Channel  PN         Ec/Io (dB)
35.1309   -106.592   08/10 19:10:07   -68.9     15.4     92.8     925      405        -24.6
35.0356   -106.629   08/11 19:38:50   -72.5     -1.9     45.8     925       45        -19.5
35.1818   -106.688   08/13 09:23:08  -105.9     13.2     33.7     925       51        -11.8
35.1726   -106.496   08/13 09:48:17  -105.5     16.4     57.1     925      417        -21.6
35.1499   -106.663   08/13 14:47:26   -83.5     23.0     53.6     925      417        -24.6
35.141    -106.667   08/13 14:48:45   -91.5     29.8     27.2     925      417        -21.5
35.0914   -106.635   08/13 15:11:11   -78.5     17.5     42.8     925      114        -17.6
35.1851   -106.606   08/15 19:42:56   -73.2     12.4     82.1     925       45        -19.8
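The coverage-versus-interference distinction drawn later in this report can be sketched over dropped-call records like those in Table 5. This is an illustrative triage only, not EnVision's analysis procedure; the -95 dBm and -12 dB thresholds are assumptions chosen for demonstration:

```python
# Illustrative triage of dropped-call records (fields as in Table 5).
# Thresholds are assumptions, not EnVision's criteria: weak RX power
# suggests a coverage hole, while healthy RX power with a poor
# strongest Ec/Io suggests interference/pilot pollution.

def classify_drop(rx_power_dbm, strongest_ecio_db,
                  rx_floor=-95.0, ecio_floor=-12.0):
    if rx_power_dbm < rx_floor:
        return "coverage-limited"
    if strongest_ecio_db < ecio_floor:
        return "interference-limited"
    return "other"

# Two rows from Table 5:
print(classify_drop(-68.9, -24.6))   # strong signal, poor Ec/Io
print(classify_drop(-105.9, -11.8))  # weak signal
```

Under these assumed thresholds, seven of the eight drops in Table 5 would be flagged as interference-limited and one as coverage-limited, which is consistent with the strongest-Ec/Io column being uniformly poor.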
Test Parameters

This section provides a summary of the test parameters used during the project.

Table 6: Test Parameters

Test Parameter                       Value
Networks surveyed                    Nextel – iDEN (321-863-2784); Cricket – CDMA (321-263-8175); Sprint – CDMA (321-917-0556); T-Mobile – GSM (321-960-4281); Verizon – CDMA (321-446-3811); Cingular – GSM (206-992-1843)
Hardware used                        Agilent E6474A (Version 8.1); Agilent E6473B (manifold); Agilent E645B PN scanner
Handsets used                        Nextel – Motorola i205; Cricket – Kyocera Slider; Sprint – Samsung A500; T-Mobile – Nokia 6230; Verizon – Kyocera KX2; Cingular – Nokia 6230
Software used                        Agilent E6474A 8.1; WSE DataQuick; Microsoft MapPoint; Microsoft Excel; Microsoft Word
Miles surveyed                       Approximately 750
Drive route selection                Provided by EnVision
Data collection window               7:00 a.m. – 7:00 p.m. daily
Call profile                         Mobile-originated calls; 90-second call duration; 30-second idle time between calls
Location of call termination server  EnVision office in Melbourne, FL
Drive test vehicle                   Standard mini-van
Handset separation                   Minimum of 18 inches
Handset location                     Cradles near headrests
Antenna location                     External: PN scanner and GPS; Internal: all handsets
Dates of survey                      August 2005
Benchmarking Methodology
This section provides a description of the call-processing algorithm used by Agilent’s data collection system as well as photographs of the equipment setup, definitions of terms used throughout this report and a description of the network ranking methodology.
The following two photographs show the vehicle/equipment configuration used during the survey.
Figure 13: System Setup Photograph One

Figure 14: System Setup Photograph Two
Call Processing

In the Agilent E6474A system, calls are initiated from the mobile to the fixed call termination server for each network/technology under survey. Calls are classified by the Agilent system as follows:

• NO SERVICE – If the phone is out of the service area, the call is classified as no service. In other words, the call is not completed due to a lack of synchronization on the control channel.

• BLOCKED – If the mobile is denied access to the system and is unable to establish communication with the call termination server, the call is classified as a blocked call. A blocked call attempt has no voice channel assigned within the call setup interval (typically 20 seconds). This includes calls being disconnected at the PSTN.

• DROPPED – If the call ends prematurely after a voice channel has been assigned, the call is classified as a dropped call.

• COMPLETED – If the call ends correctly at the end of the specified call duration, it is classified as a completed call.

During post processing, EnVision groups the No-Service and Blocked call classifications into a final classification of Access Failures. A graphical depiction of the call processing algorithm used by the Agilent system is shown in Figure 15.
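The classification rules above can be read as a simple decision function. The sketch below is one interpretation of those rules, not Agilent's implementation; the field names (`synced`, `voice_channel_assigned`, `call_seconds`) are hypothetical:

```python
# Sketch of the call classification rules above; not Agilent's code.
# Field names are hypothetical stand-ins for the measured state.

CALL_DURATION_S = 90   # specified call duration per the test plan

def classify_call(synced, voice_channel_assigned, call_seconds):
    if not synced:                       # no control-channel synchronization
        return "NO SERVICE"
    if not voice_channel_assigned:       # no voice channel within the setup interval
        return "BLOCKED"
    if call_seconds < CALL_DURATION_S:   # ended prematurely on a voice channel
        return "DROPPED"
    return "COMPLETED"

def is_access_failure(classification):
    # Post processing groups NO SERVICE and BLOCKED into Access Failures.
    return classification in ("NO SERVICE", "BLOCKED")

print(classify_call(True, True, 90))   # -> COMPLETED
print(classify_call(True, False, 0))   # -> BLOCKED
```

Note the ordering matters: a call can only be dropped after a voice channel has been assigned, so the no-service and blocked checks come first.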
Figure 15: Agilent E6474A Call Processing Flow
[Flow diagram of the call classification logic described above.]
Network Ranking

Network performance is evaluated from the subscriber’s perspective. A subscriber is most concerned with the coverage (Access Failures) and performance (Dropped Calls) of the network. EnVision uses the following two criteria when ranking network performance:

• Accessibility – Representative of a subscriber’s ability to access the system on demand. The Accessibility metric is calculated as follows:

    Accessibility = (1 – Access Failure %)

• Retainability – Representative of a subscriber’s ability to maintain a call for the desired duration. The Retainability metric is calculated as follows:

    Retainability = (1 – Dropped Call %)

Equal weighting is given to both Accessibility and Retainability. The network with the lowest combination of access failures and dropped calls is therefore ranked as the best network for the market.
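The ranking arithmetic can be reproduced from the raw call counts in Table 1. A minimal sketch (not EnVision's production tooling), checked against the Nextel and Cingular rows; it assumes the dropped-call percentage is taken over calls that successfully accessed the network:

```python
# Sketch of the ranking metrics described above, reproduced from
# the call counts in Table 1. Assumes Dropped Call % is computed
# over successfully accessed calls (matches the Table 1 figures).

def rank_metrics(attempts, no_service, blocked, dropped):
    """Return (accessibility, retainability, composite) as percentages."""
    access_failures = no_service + blocked
    accessibility = 1 - access_failures / attempts
    accessed = attempts - access_failures            # calls that got a voice channel
    retainability = 1 - dropped / accessed
    composite = (accessibility + retainability) / 2  # equal weighting
    return tuple(round(100 * m, 2)
                 for m in (accessibility, retainability, composite))

print(rank_metrics(523, 0, 3, 2))    # Nextel row of Table 1
# -> (99.43, 99.62, 99.52)
```

Ranking each network by its composite percentage then reproduces the Composite Rank column of Table 1.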
Quality Assurance

This section describes the standard operating procedure used by EnVision to ensure delivery of high-quality competitive network benchmarking surveys.
Project Planning

The first step in EnVision’s quality assurance process is the creation of the project plan. While a project’s scope is typically detailed during the sales process, several actions must take place to ensure a smooth handoff between the customer, EnVision’s sales personnel and EnVision’s project management team. To ensure the highest level of customer satisfaction, EnVision and its client must agree on a project’s test plan prior to the start of work. A typical test plan will outline the following parameters:

• Market(s) to be measured
• Equipment to be used
• Call duration
• Idle time between calls
• Call placement window
• Call type (mobile to land, land to mobile or both)
• Operators to be measured
• Drive route requirements
• Proprietary or non-proprietary results
• Parameters to collect
• Location of Fixed Voice Quality Module
• EnVision project manager contact information
• EnVision field engineer contact information
• Client main point of contact information
• Project schedule requirements

Scheduling for a project is completed based on the availability of equipment and personnel as well as client requirements. All personnel assigned to a project are made aware of which specific equipment will be used for the project as well as the project’s test plan requirements.

Spare parts are provided to the field engineer based on the project’s requirements. Having these spare parts in the field reduces downtime when failures occur. Typical spare part allocations include phone cables, batteries, antennas and phones. Should a failure occur in the field for which a spare part is not on hand, EnVision’s support engineer will arrange for replacement parts to be shipped to the field.

The equipment is configured based on the requirements of the test plan. Scanners are installed based on the mix of technologies required. All equipment is thoroughly tested and inspected prior to shipment to the field.
In-Market Setup

Once the field engineer and the equipment have arrived in the market, the in-market setup phase can commence. During this phase, the field engineer will register all handsets on the service provider networks to be tested, place manual calls with all phones, set up and program the equipment and verify equipment operability.

EnVision’s field engineer will verify that each phone can successfully place and receive calls manually. In addition, a manual call will be placed to the VQM with each handset in order to assure proper operation of the VQM. If a handset cannot place or receive calls, EnVision’s field engineer will arrange for a replacement from the service provider.

EnVision’s field engineer will physically connect all handsets to the equipment and configure the controlling software. The setup of the data collection software is a critical step in the overall quality assurance process. The field engineer will create a Test Plan File in the collection software that specifies the exact measurements to be collected (log mask) along with the call profile information. EnVision’s field engineer will verify that the test equipment is operating correctly, that GPS is configured properly, and that all phones are placing calls as specified in the data collection software.
Drive Test Clearance

Both the field engineer and the support engineer complete the Drive Test Clearance phase. The primary purpose of this step is to obtain authorization to proceed with the data collection. Authorization to proceed is given only upon agreement between the field engineer and the support engineer that all equipment is operating correctly, that the parameters of the test plan have been programmed and that all required data is being collected by the equipment.
The field engineer will complete a test file by placing calls on all handsets while the vehicle is stationary as well as while the vehicle is moving along a pre-defined test route. Stationary testing is completed in a location where all networks have sufficient coverage. The drive route used for the mobile testing is defined in a manner to be representative of the entire survey area. Once a sufficient number of calls have been placed on all handsets, the test file is terminated and forwarded to the support engineer assigned to the survey.
The support engineer will examine the test file to verify that all handsets are operating correctly, that navigational data is being logged, that the call profile has been set up correctly and that all log masks have been set up correctly. Any anomalies are communicated immediately to the field engineer.
Once the field engineer and the support engineer agree that all equipment is operating according to the test plan, clearance to proceed with data collection is given.
Data Collection

The Data Collection phase is a daily feedback loop between the field engineer and the support engineer. After each day of data collection, the support engineer reviews the data to make sure that all equipment is operating correctly. This process ensures that, in the event of an anomaly, a maximum of one day’s data is contaminated or lost.

Each day prior to the start of data collection, the field engineer will physically inspect all handsets, cables, antennas, batteries and connections. In the event that a defective part is observed, the field engineer will replace the part using the available inventory of spares. Lastly, software parameters are verified to be consistent with the test plan.

During data collection, the field engineer continuously inspects the system for anomalies. The field engineer is trained to observe the call statistics in real time for abnormally high rates of call failures that could be indicative of equipment failure. The field engineer will also monitor various engineering parameters for each technology so correlations can be drawn between coverage and interference issues in the networks being tested. Finally, the engineer will monitor the navigation system to ensure that positional information is being logged correctly.

For each data file that is recorded, the field engineer will prepare notes that will be used by the support engineer and the post processing team during the analysis of the collected data. The specific information recorded includes file names, start and end times, start and end mileage, consecutive call failures, bad files, equipment failures and other anomalies.

Upon completion of each day’s collection activities, the field engineer will analyze the data collected to verify data integrity. All files from the day are merged and call statistics produced. The field engineer is trained on the data ranges to expect for various technologies.
In addition, the field engineer will verify that all files contain valid navigational information. Upon completion of this step, the field engineer will forward the data to the support engineer for further analysis.
Each day the support engineer will review the data collected on the previous day. The support engineer will examine each file while referring to the notes prepared by the field engineer. Each call failure that is recorded is examined in detail to determine its validity. Various graphs are created for the call failures to determine additional anomalies. The post-processing engineer marks all data that is determined to be anomalous for further review. The support engineer will contact the field engineer upon completion of the data review to communicate any issues identified. A copy of each day’s call statistics is provided to the customer.

Upon completion of all drive testing, the field engineer will record a test file. The purpose of this file is to verify that the equipment and the handsets display similar performance at both the beginning and the end of testing. Upon completion of this test file, the field engineer forwards the file to the support engineer for review. Upon confirmation from the support engineer that all data has been collected properly, the field engineer will disassemble the test equipment and prepare for delivery to the next market.
Data Post Processing

The Data Post Processing stage begins upon completion of all drive testing. This phase of the project results in a draft version of the survey’s deliverable report.

Each day’s data is examined separately along with the test engineer’s notes. All files are converted and mapped. Preliminary graphs and tables are created to show the results of call statistics and call failures. Any problems identified as equipment failure are removed from the data files before further processing.

Once the data has been cleaned using the data collector’s notes, all files for the survey are merged into one master file containing all sets of data for all carriers measured. This master file allows for more efficient processing. Call statistics are then tabulated and compared to historical data to identify macro-level data concerns. Abnormally high percentages of call failures tend to indicate equipment failure. If abnormally high percentages of call failures are found, further analysis is completed to determine the cause.

Distribution charts are generated for engineering parameters from similar-technology service providers. In addition, tables of call statistics are created to display the overall performance of each network. This information is used to create the final network rankings for the survey. Once all charts, graphs, tables and geographic plots have been created, they are compiled to produce a preliminary draft of the survey report.
Report Review

The draft report created in the Data Post Processing step is reviewed for quality assurance. All unresolved anomalies are researched to conclusion and documented in the report. The output of this step is the release of the report to the production team.

EnVision’s lead engineers review all reports to validate all engineering observations, check formatting, verify reported results and identify any remaining data concerns. Any necessary changes are made and a final review draft is created. EnVision’s lead engineers will review the final draft to verify that all changes have been made. Upon verification, the report is released to production.
Report Production

All customer deliverables are prepared in quantity as required according to the terms of the contract. The final report is burned to CD in Adobe .pdf format along with all geographic plots. Hyperlinks are provided in the .pdf for ease of report review. All reports are printed and bound professionally. Printed copies are spot-checked for printing errors.
Shipment and Data Archival

The final step in EnVision’s quality assurance process is the Shipment and Data Archival stage. All deliverables are packaged and shipped according to the instructions provided in the customer’s purchase order. An internal copy of the CD and deliverable report is produced and maintained in EnVision’s library. All data is archived onto CD and stored in the library along with the internal copies of the deliverables.
Appendix

The following geographic plots are attached to this report.
Albuquerque
Albuquerque Drive Routes
Call Failures (All Networks)
Cingular Call Performance
Cingular RXQual & Call Failures
Cingular RXLev & Call Failures
Cricket Call Performance
Cricket Forward FER & Call Failures
Cricket Aggregate Ec/Io & Call Failures
Cricket Mobile Receive Power & Call Failures
Cricket Mobile Transmit Power & Call Failures
Nextel Call Performance
Nextel SQE & Call Failures
Nextel Mobile Receive Power & Call Failures
Sprint Call Performance
Sprint FER & Call Failures
Sprint Aggregate Ec/Io & Call Failures
Sprint Mobile Receive Power & Call Failures
Sprint Mobile Transmit Power & Call Failures
T-Mobile Call Performance
T-Mobile RXQual & Call Failures
T-Mobile RXLev & Call Failures
Verizon Call Performance
Verizon FER & Call Failures
Verizon Aggregate Ec/Io & Call Failures
Verizon Mobile Receive Power & Call Failures
Verizon Mobile Transmit Power & Call Failures
Copyright © 1988-2003 Microsoft Corp. and/or its suppliers. All rights reserved. http://www.microsoft.com/mappoint
© Copyright 2002 by Geographic Data Technology, Inc. All rights reserved. © 2002 Navigation Technologies. All rights reserved. This data includes information taken with permission from Canadian authorities © 1991-2002 Government of Canada (Statistics Canada and/or Geomatics Canada), all rights reserved.
Albuquerque Drive Route
[Map: the survey drive route through the Albuquerque market; scale 0 to 14 mi.]
Call Failures (All Networks)
[Map: blocked, dropped and no-service call locations for Nextel, Cingular, Verizon, Sprint, T-Mobile and Cricket; scale 0 to 14 mi.]
Cingular GSM Call Performance
[Map: call results for Cingular GSM: Completed (442), Blocked (22), Dropped (2), No Service (0).]
Cingular GSM RXQual and Call Failures
[Map: GSM RxQual (binned 0 to 1, 2 to 3, 4 to 5, 6 to 7) overlaid with blocked, dropped and no-service call locations.]
Cingular GSM RXLev and Call Failures
[Map: GSM RxLev in dBm (bins: >= -75, -75 to -85, -85 to -95, -95 to -104, < -104) with call failure locations (Blocked, Dropped, No Service).]
Cricket CDMA Call Performance
[Map: call results along the drive route. Legend: Completed (447), Blocked (9), Dropped (8), No Service (1).]
Cricket CDMA Forward FER and Call Failures
[Map: forward FER in % (bins: 0 to 1, 1 to 2, 3 to 4, > 4) with call failure locations (Blocked, Dropped, No Service).]
Cricket CDMA Aggregate Ec/Io and Call Failures
[Map: aggregate Ec/Io in dB (bins: > -6, -9 to -6, -12 to -9, < -12) with call failure locations (Blocked, Dropped, No Service).]
Cricket CDMA RX Power and Call Failures
[Map: CDMA Rx power in dBm (bins: >= -75, -75 to -85, -85 to -95, -95 to -104, < -104) with call failure locations (Blocked, Dropped, No Service).]
Cricket CDMA TX Power and Call Failures
[Map: CDMA Tx power in dBm (bins: >= 15, 0 to 14, -15 to -1, < -16) with call failure locations (Blocked, Dropped, No Service).]
Nextel iDEN Call Performance
[Map: call results along the drive route. Legend: Completed (446), Blocked (3), Dropped (2), No Service (0).]
Nextel iDEN SQE and Call Failures
[Map: iDEN SQE in dB (bins: > 32, 22 to 32, 10 to 22, < 10) with call failure locations (Blocked, Dropped, No Service).]
Nextel iDEN RX Power and Call Failures
[Map: iDEN Rx power in dBm (bins: > -75, -75 to -85, -85 to -95, -95 to -104, < -104) with call failure locations (Blocked, Dropped, No Service).]
Sprint CDMA Call Performance
[Map: call results along the drive route. Legend: Completed (441), Blocked (12), Dropped (2), No Service (2).]
Sprint CDMA Forward FER and Call Failures
[Map: forward FER in % (bins: 0 to 1, 1 to 2, 3 to 4, > 4) with call failure locations (Blocked, Dropped, No Service).]
Sprint CDMA Aggregate Ec/Io and Call Failures
[Map: aggregate Ec/Io in dB (bins: >= -6, -9 to -6, -12 to -9, < -12; plus No Data Recorded) with call failure locations (Blocked, Dropped, No Service).]
Sprint CDMA RX Power and Call Failures
[Map: CDMA Rx power in dBm (bins: >= -75, -75 to -85, -85 to -95, -95 to -104, < -104) with call failure locations (Blocked, Dropped, No Service).]
Sprint CDMA TX Power and Call Failures
[Map: CDMA Tx power in dBm (bins: >= 15, 0 to 14, -15 to -1, < -16) with call failure locations (Blocked, Dropped, No Service).]
T-Mobile GSM Call Performance
[Map: call results along the drive route. Legend: Completed (416), Blocked (4), Dropped (5), No Service (1).]
T-Mobile GSM RXQual and Call Failures
[Map: GSM RxQual (bins: 0 to 1, 2 to 3, 4 to 5, 6 to 7) with call failure locations (Blocked, Dropped, No Service).]
T-Mobile GSM RXLev and Call Failures
[Map: GSM RxLev in dBm (bins: >= -75, -75 to -85, -85 to -95, -95 to -104, < -104) with call failure locations (Blocked, Dropped, No Service).]
Verizon CDMA Call Performance
[Map: call results along the drive route. Legend: Completed (463), Blocked (11), Dropped (11), No Service (2).]
Verizon CDMA Forward FER and Call Failures
[Map: forward FER in % (bins: 0 to 1, 1 to 2, 3 to 4, > 4) with call failure locations (Blocked, Dropped, No Service).]
Verizon CDMA Aggregate Ec/Io and Call Failures
[Map: aggregate Ec/Io in dB (bins: > -6, -9 to -6, -12 to -9, < -12) with call failure locations (Blocked, Dropped, No Service).]
Verizon CDMA RX Power and Call Failures
[Map: CDMA Rx power in dBm (bins: >= -75, -75 to -85, -85 to -95, -95 to -104, < -104) with call failure locations (Blocked, Dropped, No Service).]
Verizon CDMA TX Power and Call Failures
[Map: CDMA Tx power in dBm (bins: >= 15, 0 to 14, -15 to -1, < -16) with call failure locations (Blocked, Dropped, No Service).]
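As a cross-check on the maps above, the per-network call counts printed in each Call Performance legend can be reduced to a single failure rate per network. This is a minimal sketch, not the report's ranking methodology: the counts below are the legend values from this appendix, and the assumption that failure rate = (blocked + dropped + no service) / total attempts is the author of this sketch's, not necessarily EnVision's.

```python
# Legend counts from the Call Performance maps in this appendix:
# network: (completed, blocked, dropped, no_service)
LEGEND_COUNTS = {
    "Cingular": (442, 22, 2, 0),
    "Cricket":  (447, 9, 8, 1),
    "Nextel":   (446, 3, 2, 0),
    "Sprint":   (441, 12, 2, 2),
    "T-Mobile": (416, 4, 5, 1),
    "Verizon":  (463, 11, 11, 2),
}

def failure_rate(completed, blocked, dropped, no_service):
    """Percent of attempted calls that blocked, dropped, or found no service.

    Assumed definition for this sketch; the report's Network Ranking
    section defines the official metric.
    """
    attempts = completed + blocked + dropped + no_service
    return 100.0 * (blocked + dropped + no_service) / attempts

# Print networks from best (lowest failure rate) to worst.
for network, counts in sorted(LEGEND_COUNTS.items(),
                              key=lambda kv: failure_rate(*kv[1])):
    print(f"{network:8s} {failure_rate(*counts):5.2f}% of {sum(counts)} attempts")
```

On these counts Nextel shows the fewest failures per attempt and Cingular the most, but note the attempt totals differ slightly by network, which is why a rate rather than a raw failure count is the fairer comparison.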