Models, simulations and predictive measures of network performance cannot capture true user experience. That can only be measured at the most relevant information source: actual mobile phone users.
Opensignal’s objective is to report as accurately as possible the real-world mobile experience as recorded by mobile network users.
We strongly believe that:
- What matters most when assessing network performance is how it is experienced by subscribers themselves. This is best understood by capturing the real-world experience of users as they interact with the network in the places where they live, work and travel.
- By making this information transparently available, we address the important need for consumers to understand the most relevant aspect of network performance to them: how it translates into their user experience.
Although operators have been monitoring performance from when the very first networks were built, there remains a disconnect between their engineering tests and the typical experience of everyday users.
We believe that the only way to bridge this gap is to measure the network using customers’ actual experience as a starting point. This user-centric approach provides a rich source of real-world user data from which mobile operators can accurately understand and evaluate the true network experience of their subscribers.
In order to make decisions based on our findings, consumers, regulators, network operators and other industry stakeholders need to be able to know that the facts we present are accurate and that the comparisons we make are valid. We openly share the principles that govern our business, the philosophy behind our experience measurements and the analysis behind them in our Independence Charter, Experience Charter and Analytics Charter.
This methodology document explains in more detail how we go about the measurement of mobile network experience and how we analyze those measurements to create the most accurate reports and insights about mobile experience globally.
Metrics in Public Reports
Opensignal’s public reports contain the set of metrics we believe best represent users’ experience of accessing real services. We continually invest in research and development to find new measures. For example, in 2018 we launched the telecom industry’s first independent measure of real-world mobile video experience, spanning multiple video content platforms.
We make more granular mobile analytics insights available to network operators in order for them to understand in greater detail, and ultimately improve, the network service they provide. This granular view can also be used by regulators and analysts. Our current list of metrics in public reports is:
4G Availability
Measures the average proportion of time Opensignal users spend with a 4G connection on each operator’s network.
Video Experience
Measures the average video experience of Opensignal users on 3G and 4G networks for each operator. Our methodology involves measuring real-world video streams and uses an ITU-based approach for determining video quality. The metric calculation takes picture quality, video loading time and stall rate into account. We report video experience on a scale of 0-100, with scores falling into the following categories:
- 75 or above Excellent
- 65 < 75 Very Good
- 55 < 65 Good
- 40 < 55 Fair
- Under 40 Poor
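The score bands above translate directly into a threshold lookup. The sketch below is purely illustrative (the function name is ours; the bands are as listed):

```python
def video_experience_category(score: float) -> str:
    """Map a 0-100 Video Experience score to its reporting band."""
    if score >= 75:
        return "Excellent"
    if score >= 65:
        return "Very Good"
    if score >= 55:
        return "Good"
    if score >= 40:
        return "Fair"
    return "Poor"
```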
3G Video Experience
Measures the average video experience of Opensignal users on 3G networks for each operator. Our methodology involves measuring real-world video streams and uses an ITU-based approach for determining video quality. The metric calculation takes picture quality, video loading time and stall rate into account. We report video experience on a scale of 0-100, with scores falling into the following categories:
- 75 or above Excellent
- 65 < 75 Very Good
- 55 < 65 Good
- 40 < 55 Fair
- Under 40 Poor
4G Video Experience
Measures the average video experience of Opensignal users on 4G networks for each operator. Our methodology involves measuring real-world video streams and uses an ITU-based approach for determining video quality. The metric calculation takes picture quality, video loading time and stall rate into account. We report video experience on a scale of 0-100, with scores falling into the following categories:
- 75 or above Excellent
- 65 < 75 Very Good
- 55 < 65 Good
- 40 < 55 Fair
- Under 40 Poor
Voice App Experience
Measures the quality of experience for over-the-top (OTT) voice services — mobile voice apps such as WhatsApp, Skype, Facebook Messenger etc. — using a model derived from the International Telecommunication Union (ITU)-based approach for quantifying overall voice call quality and a series of calibrated technical parameters. This model characterizes the exact relationship between the technical measurements and perceived call quality. Voice App Experience for each operator is calculated on a scale from 0 to 100.
- 95 or above Excellent - Most users very satisfied
- 87 < 95 Very Good - Most users satisfied
- 80 < 87 Good - Many users satisfied
- 74 < 80 Acceptable - Users satisfied
- 66 < 74 Poor - Many users dissatisfied
- 60 < 66 Very Poor - Most users dissatisfied
- 45 < 60 Unintelligible - Nearly all users dissatisfied
- Under 45 Impossible to communicate
4G Voice App Experience
This metric quantifies the quality of experience over mobile voice services for each operator on LTE connections as experienced by Opensignal users.
3G Voice App Experience
This metric quantifies the quality of experience over mobile voice services for each operator on 3G connections as experienced by Opensignal users.
Latency Experience
Measures the average latency experienced by Opensignal users across an operator's 3G and 4G networks. Latency, measured in milliseconds, is the delay users experience as data makes a round trip through the network. A lower score in this metric is a sign of a more responsive network.
4G Latency
Measures the average latency for each operator on 4G connections as experienced by Opensignal users.
3G Latency
Measures the average latency for each operator on 3G connections as experienced by Opensignal users.
Download Speed Experience
Measures the average download speed experienced by Opensignal users across an operator's 3G and 4G networks. It doesn't just factor in 3G and 4G speeds, but also the availability of each network technology. Operators with lower 4G Availability tend to have a lower Download Speed Experience because their customers spend more time connected to slower 3G networks.
3G Download Speed
Measures the average download speed for each operator on 3G connections as experienced by Opensignal users.
4G Download Speed
Measures the average download speed for each operator on 4G connections as experienced by Opensignal users.
Peak Download Speed
Measures the fastest download speeds users experience on real-world networks. Note that this is different from the best-case speeds measured in idealized conditions or the theoretical maximum speeds that users themselves will never receive in a real-world situation. To assess Peak Download Speed, we use the speed experienced by the 98th percentile of users.
Upload Speed Experience
Measures the average upload speed experienced by Opensignal users across an operator's 3G and 4G networks. Upload Speed Experience doesn't just factor in 3G and 4G speeds, but also the availability of each network technology. Operators with lower 4G Availability tend to have a lower Upload Speed Experience because their customers spend more time connected to slower 3G networks.
3G Upload Speed
Measures the average upload speed for each operator on 3G connections as experienced by Opensignal users.
4G Upload Speed
Measures the average upload speed for each operator on 4G connections as experienced by Opensignal users.
Peak Upload Speed
Measures the fastest upload speeds users experience on real-world networks. Note that this is different from the best-case speeds measured in idealized conditions or the theoretical maximum speeds that users themselves will never receive in a real-world situation. To assess Peak Upload Speed, we use the speed experienced by the 98th percentile of users.
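The 98th-percentile rule behind both peak metrics can be sketched as below, using Python's statistics module. This is an illustration of the percentile calculation, not Opensignal's implementation:

```python
import statistics

def peak_speed(per_user_speeds: list[float]) -> float:
    """Estimate a Peak Speed metric as the 98th percentile of per-user speeds.

    statistics.quantiles(..., n=100) returns the 99 percentile cut points;
    index 97 is the 98th percentile.
    """
    return statistics.quantiles(per_user_speeds, n=100)[97]
```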
How Opensignal Collects and Analyzes Data
Our process begins with collecting billions of measurements daily from over 100 million devices globally.
Our software is installed within our own and partner apps. The partners we work with are strategically selected to cover a wide range of users, demographics, and devices.
Some speed tests published by other sources use data only from their own apps, which restricts the sample to a certain type of user, and some restrict data collection to just the latest devices, which again limits the user base.
We do not impose any such restrictions and aim for the widest sample base that most accurately reflects the make-up of the entire population.
We take extensive measures to ensure that the privacy of mobile users is respected through the entire data collection process within our own and our partners’ apps. The details can be found in our Data Privacy Charter.
The process is designed to ensure that incorrect or potentially distorting data is not able to influence the results and that those results are shown in a way that can be clearly understood and relied upon.
Collection – General principles
- Opensignal collects billions of individual measurements every day from over 100 million devices worldwide, under conditions of normal usage, including measurements in both indoor and outdoor locations. As users spend most of their time in indoor locations, most of our measurements will correspondingly be collected from indoor locations.
- To calculate our video metric, we use video content providers selected to represent typical user experience. Our measurements are designed so that operators cannot optimize their networks to treat our traffic differently and therefore impact the results without making actual improvements to their networks.
- Opensignal collects measurements of network speed based on both user-initiated tests and automated tests. The majority of measurements are generated through automated tests (no user interaction), executed independently and at random intervals to capture what users are experiencing at a typical moment in time. This approach is recognized as best practice by a number of official bodies including the FCC in the U.S.
- Opensignal does not use dedicated test servers. We measure the end-to-end consumer network experience and the full path from the user device all the way to the Content Delivery Networks (CDNs) such as Google, Akamai and Amazon.
Collection – Active speed measurements
When any application downloads data there is an initial ramp-up period as the connection is established where the download speed will not be the same as the stable speed achieved once the download is in progress.
On today’s networks, from a user experience point of view, speed-sensitive applications such as streaming video or large file downloads are influenced mainly by the stable speed (sometimes called the “goodput”), not the speed during the ramp-up time. Conversely, applications such as web browsing are influenced heavily by the ramp-up time and by latency.
Measuring Speed Experience:
As the Opensignal Download Speed Experience metric is focused on measuring the user experience of these applications such as streaming and large file downloads, which are influenced primarily by the stable speed, we use a fixed time test, rather than a simple fixed file size download.
A fixed time period enables the speed measured to be a much closer representative of the stable speed the user experiences through the application.
As well as being more representative, this fixed time approach makes comparisons between widely different network speeds more meaningful. A file of a few MB downloaded over a 3G network will take several seconds and the speed measured will be influenced primarily by the goodput and only slightly by the ramp-up. The download of the same file over an LTE-A network will take a much shorter time and the overall speed measured will be dominated by the ramp-up time and unrelated to the speed seen by a streaming application.
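To make the fixed-time approach concrete, a measurement of this kind can be sketched as follows. This is an illustrative sketch, not Opensignal's implementation; the sampling scheme and the way the ramp-up window is discarded are our assumptions:

```python
def fixed_time_speed_mbps(bytes_per_interval: list[int],
                          interval_s: float,
                          ramp_up_s: float = 0.0) -> float:
    """Average speed over a fixed-duration test, in Mbps.

    bytes_per_interval[i] is the number of bytes received during the
    i-th sampling interval. Intervals inside the ramp-up window are
    discarded so the result approximates the stable speed ("goodput").
    """
    skip = int(ramp_up_s / interval_s)
    payload_bits = 8 * sum(bytes_per_interval[skip:])
    duration_s = (len(bytes_per_interval) - skip) * interval_s
    return payload_bits / duration_s / 1e6
```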
Collection – Video Experience measurement
The Opensignal Video Experience measurement directly streams sample video from typical content providers and measures a range of parameters that directly impact the user experience, such as the loading time (the time taken for the video to start) and the video stalling ratio (the proportion of users who experience an interruption in playback after video begins streaming) for different picture qualities or bit rates.
Accounting for traffic management:
The use of real video traffic is key to this measurement. Many operators around the world are starting to handle video traffic differently to other traffic at various points in the network, applying traffic shaping intended to maximize the number of users who get a good experience (these techniques are sometimes called "video throttling", but that is an oversimplification). As our measurement uses real video streamed from typical content providers, the impact of all of these practices is captured.
Opensignal uses a rigorous post-processing system that takes the raw measurements and calculates robust and representative metrics. This includes a number of steps to quality-assure the measurements.
For example, if a user failed to download any content, this measurement is eliminated and treated as a “failed test” rather than being included in the average speed calculation.
Also, when calculating metrics for a given network technology (e.g. 4G), measurements where a network type change is detected (e.g. from 4G to 3G) during the measurement are not included.
We automatically filter out certain entries (e.g. when a phone is in a call) that are known to produce atypical results.
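Taken together, these quality-assurance rules amount to a filter over raw measurement records. The sketch below is illustrative; the field names are our assumptions, not Opensignal's schema:

```python
def is_valid_measurement(m: dict) -> bool:
    """Quality-assurance filter over a raw measurement record.

    A record is dropped when no content was downloaded (a "failed test"),
    when the network type changed during the measurement, or when the
    device was in a call. Field names are illustrative.
    """
    if m["bytes_downloaded"] == 0:
        return False
    if m["tech_at_start"] != m["tech_at_end"]:
        return False
    if m["in_call"]:
        return False
    return True
```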
Operator name mapping
To ensure that the results only reflect the experience of customers who bought the operator’s own branded service, we remove results from Mobile Virtual Network Operator (MVNO) subscribers and subscribers who are roaming. These subscribers may be subject to different Quality of Service (QoS) restrictions than an operator’s own customers and so their experience may be different.
Selection of network type
We consolidate data into technology types — e.g. when considering 3G connections, we include HSDPA, HSUPA and UMTS R99 into one group.
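A minimal sketch of this consolidation, assuming a simple lookup table (the exact set of technologies mapped is ours, based on the example in the text):

```python
# Illustrative mapping from radio access technology to a reporting group,
# following the example in the text (HSDPA, HSUPA and UMTS R99 -> 3G).
TECH_GROUPS = {
    "UMTS R99": "3G", "HSDPA": "3G", "HSUPA": "3G",
    "LTE": "4G", "LTE-A": "4G",
}

def technology_group(radio_tech: str) -> str:
    """Consolidate a detailed radio technology into its reporting group."""
    return TECH_GROUPS.get(radio_tech, "other")
```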
Per-device averaging
We calculate a single average per device to ensure every device has an equal effect on the overall result. Essentially, we employ a “one device, one vote” policy in our calculations.
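The "one device, one vote" policy can be sketched as a two-step average: collapse each device's measurements to a single value, then average across devices. Illustrative code, not Opensignal's implementation:

```python
from collections import defaultdict
from statistics import mean

def one_device_one_vote(measurements: list[tuple[str, float]]) -> float:
    """Average per device first, then across devices, so every device
    carries equal weight regardless of how many measurements it produced."""
    by_device: dict[str, list[float]] = defaultdict(list)
    for device_id, value in measurements:
        by_device[device_id].append(value)
    per_device = [mean(values) for values in by_device.values()]
    return mean(per_device)
```

Without the per-device step, a single heavy-testing device could dominate the operator's average.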
Removing extreme values
We eliminate a percentage of extreme high and low values. This removal of extremes is common data science practice and ensures the average calculated represents typical user experience.
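A common form of this trimming drops a fixed fraction from each tail before averaging. The 1% default below is an assumption for illustration; the text only says that "a percentage" of extreme values is removed:

```python
def trimmed_values(values: list[float], trim_fraction: float = 0.01) -> list[float]:
    """Drop the top and bottom trim_fraction of values before averaging.

    The default fraction is illustrative, not Opensignal's actual figure.
    """
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)
    return ordered[k:len(ordered) - k] if k else ordered
```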
Analytics – Reporting
Per Device Values:
Per device values are combined using a simple average to yield the Opensignal metrics that are found in our reports and analysis. For each of our reports, we show details of the active userbase we have sampled from to calculate the results. The sample size will differ for each mobile operator, as users can only measure their own network provider’s performance.
We provide an upper and lower estimate of confidence interval per operator, calculated using recognized standard techniques based on the sample size of measurements. Confidence intervals provide information on the margins of error or the precision in the metric calculations. They represent the range in which the true value is very likely to be, taking into account the entire range of data measurements.
Statistically significant results:
Whenever the confidence intervals for two or more operators overlap in a particular metric, the result is a statistical tie. This is because one operator's apparent lead may not hold true once measurement uncertainty is taken into account. For this reason, awards could have multiple winners in our reports.
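A hedged sketch of both steps, using a normal-approximation confidence interval (the z-value and the exact interval method are our assumptions; the text only says "recognized standard techniques"):

```python
import math

def confidence_interval(mean: float, stdev: float, n: int,
                        z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation confidence interval for a sample mean
    (z=1.96 gives roughly a 95% interval)."""
    margin = z * stdev / math.sqrt(n)
    return mean - margin, mean + margin

def is_statistical_tie(ci_a: tuple[float, float],
                       ci_b: tuple[float, float]) -> bool:
    """Two operators tie on a metric when their intervals overlap."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]
```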
Standardized geographical boundaries:
A common practice in reports from other sources is to “cherry-pick” geographic boundaries or time periods to be able to make an observation about a specific operator. For example, highlighting performance for a particular area of a city, or over a particular time period. We do not do this and only report on standardized geographical boundaries (where available) and over the entire period covered by the measurements. Our reporting timetable is under our control and not released to operators in advance. This ensures that reports represent the consistent experience of the majority of users.
Analytics – Video Experience
The various parameters measured by the Video Experience metric are combined using an algorithm based on ITU recommendations to calculate a Mean Opinion Score (MOS), which is then translated to a scale of 0-100.
Our Insights Solutions used by operators, regulators and analysts allow the individual parameters to be studied as well as the overall MOS.
75 or above
Excellent: Very consistent experience across all users, video streaming providers and resolutions tested, with fast loading times and almost non-existent stalling.
65 < 75
Very Good: Generally fast loading times and only occasional stalling but the experience might be somewhat inconsistent across users and/or video providers/resolutions.
55 < 65
Good: Less consistent experience, even from the same video streaming provider and particularly for higher resolutions, with noticeably slower loading times and stalling not being uncommon.
40 < 55
Fair: Not a good experience either for higher resolution videos (very slow loading times and prolonged stalling) or for some video streaming providers. Experience on lower resolution videos from some providers might be sufficient though.
Under 40
Poor: Not a good experience even for lower resolution videos across all providers. Very slow loading times and frequent stalling are common.
Analytics – Voice App Experience
Opensignal’s Voice App Experience results quantify the overall voice app experience for each operator on a scale from 0 to 100. The following can be used to relate the Voice App Experience scores to end-users’ mobile experience:
95 or above
Excellent: Most users are very satisfied. Operator provides consistently good OTT voice quality experience across the customer base.
87 < 95
Very Good: Most users are satisfied. Operator generally provides a good OTT voice quality experience. Occasionally, there may be some impairments to the call, primarily related to the level of loudness.
80 < 87
Good: Many users are satisfied. Minor quality impairments are experienced by some users. Sometimes the background is not quite clear: it can be either hazy or not loud enough. Clicking sounds or distortion are very occasionally present.
74 < 80
Acceptable: Users are satisfied. Perceptible call quality impairments experienced by some users. Short duration of clicking sounds or distortion can be heard, and/or the volume may not be sufficiently loud. Listener is generally able to comprehend without repetition.
66 < 74
Poor: Many users are dissatisfied. Call quality impairments are experienced by many users. Distortion, clicking sounds or silence are experienced during the call, which is perceptible and can be annoying.
60 < 66
Very Poor: Most users are dissatisfied. Significant call quality impairments are experienced by most users. Occasional instances of distortion, clicking sounds or silence are experienced during the call. It can be difficult to understand parts of the conversation without repetition.
45 < 60
Unintelligible: Nearly all users are dissatisfied. Frequent instances of long pauses, clicking sounds or distortion can be heard during the call. Frequent repetition is required to be comprehensible, or there are frequent conversation overlaps.
Under 45
Impossible to communicate.
Analytics – 4G Availability
Our 4G Availability metric shows the proportion of time users with a 4G device and subscription have a 4G connection. For example, a reported 4G Availability of 75% means that 4G users were, on average, connected to 4G services on their network 75% of the time.
Availability is not a measure of coverage or the geographic extent of a network; it measures what proportion of time users have a connection.
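As a sketch, the metric is a time ratio over observed connections; the sampling-based input format below is our assumption, not Opensignal's implementation:

```python
def availability_4g(connection_samples: list[str]) -> float:
    """4G Availability as the percentage of time samples with a 4G connection.

    connection_samples holds the technology observed at regular intervals
    for users with a 4G device and subscription (illustrative format).
    """
    on_4g = sum(1 for tech in connection_samples if tech == "4G")
    return 100.0 * on_4g / len(connection_samples)
```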
Our measurements are designed to capture as accurately as possible the experience of typical real users and are subject to all the factors that affect real user traffic, representing a wide base of users and devices.
Our scientific analysis processes these measurements to create the most accurate possible picture of user experience and how it varies between operators, regions and countries, as far as possible based only on measurements from real users.
Our reports present our findings objectively, with confidence intervals shown and only statistically significant conclusions drawn.
Because of this, our reports are recognized as the most trusted source of mobile network experience around the world.
Learn more about the principles that govern our business in the Opensignal Manifesto.