Methodology Overview

How Opensignal measures mobile network experience

We report as accurately as possible the real-world mobile experience as recorded by mobile network users.

To make decisions based on our market insights, network operators, regulators, analysts and consumers need to know the facts we present are accurate and that the comparisons we make are valid.


Read on to understand:

  • The definitions of the metrics we use in our reports and insights
  • The principles that guide our collection of data from mobile phones around the world
  • The data processing techniques we use to analyze the billions of measurements we collect daily
  • The rigorous scientific analytics that guide the interpretation of our results

Models, simulations and predictive measures of network performance cannot measure true user experience from the most relevant information source: actual mobile phone users.

Brendan Gill, Co-Founder, Opensignal

Introduction

Opensignal’s objective is to report as accurately as possible the real-world mobile experience as recorded by mobile network users.

We strongly believe that:

  • What matters most when assessing network performance is how it is experienced by subscribers themselves. This is best understood by capturing the real-world experience of users as they interact with the network in the places where they live, work and travel.
  • By making this information transparently available, we address the important need for consumers to understand the most relevant aspect of network performance relative to them, namely how it translates to their user experience.

Although operators have been monitoring performance since the very first networks were built, there remains a disconnect between their engineering tests and the typical experience of everyday users.

We believe that the only way to bridge this gap is to measure the network using customers’ actual experience as a starting point. This user-centric approach provides a rich source of real-world user data from which mobile operators can accurately understand and evaluate the true network experience of their subscribers.

To make decisions based on our findings, consumers, regulators, network operators and other industry stakeholders need to be able to know that the facts we present are accurate and that the comparisons we make are valid. We openly share the principles that govern our business, the philosophy behind our experience measurements and the analysis behind them in our Independence Charter, Experience Charter and Analytics Charter.

This methodology document explains in more detail how we go about measuring mobile network experience and how we analyze our measurements to create the most accurate reports and insights about mobile experience globally.

Metrics in public reports

Opensignal’s public reports contain the set of metrics we believe best represent users’ experience of accessing real services. We continually invest in research and development to find new measures. For example, in 2018 we launched the telecom industry’s first independent measure of real-world mobile video experience, spanning multiple video content platforms.

We make more granular mobile analytics insights available to network operators to enable them to understand in greater detail, and ultimately improve, the network service they provide. This granular view can also be used by regulators and analysts. 

Our current list of metrics in public reports includes:

Video Experience

Opensignal’s Video Experience quantifies the quality of video streamed to mobile devices by measuring real-world video streams over an operator's networks. The metric is based on an International Telecommunication Union (ITU) approach, built upon detailed studies which have derived a relationship between technical parameters, including picture quality, video loading time and stall rate, and the perceived video experience as reported by real people. To calculate Video Experience, we directly measure video streams from end-user devices and use this ITU approach to quantify the overall video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions — including Full HD (FHD) and 4K / Ultra HD (UHD) — and are streamed directly from the world’s largest video content providers.

  • Excellent (75 or above)
  • Very Good (65 or more but less than 75)
  • Good (55 or more but less than 65)
  • Fair (40 or more but less than 55)
  • Poor (Under 40)
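
As a worked illustration of these bands, mapping a score to its category can be sketched as follows (the thresholds are taken from the list above; the function name is ours, not Opensignal's):

```python
def video_experience_band(score: float) -> str:
    """Map a 0-100 Video Experience score to its category band,
    using the published thresholds."""
    if score >= 75:
        return "Excellent"
    if score >= 65:
        return "Very Good"
    if score >= 55:
        return "Good"
    if score >= 40:
        return "Fair"
    return "Poor"
```

Note that the bands are half-open: a score of exactly 75 is Excellent, while 74.9 is Very Good.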

5G Video Experience

The average Video Experience of Opensignal users when they were connected to an operator’s 5G network. 

Video Experience – 5G Users

The average Video Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G video experience along with the availability of each technology.

4G Video Experience

The average Video Experience of Opensignal users on an operator's 4G network. 

3G Video Experience

The average Video Experience of Opensignal users on an operator’s 3G network. 

Games Experience

Opensignal’s Games Experience measures how mobile users experience real-time multiplayer mobile gaming on an operator’s network. Measured on a scale of 0-100, it analyzes how our users’ multiplayer mobile gaming experience is affected by mobile network conditions including latency, packet loss and jitter.

  • Excellent (85 or above)
  • Good (75 or more but less than 85)
  • Fair (65 or more but less than 75)
  • Poor (40 or more but less than 65)
  • Very Poor (Under 40)

5G Games Experience

The average Games Experience of Opensignal users when they were connected to an operator’s 5G network.

Games Experience – 5G Users

The average Games Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G games experience along with the availability of each technology.

4G Games Experience

The average Games Experience of Opensignal users on an operator's 4G network.

3G Games Experience

The average Games Experience of Opensignal users on an operator's 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO) network.

Voice App Experience

Opensignal's Voice App Experience measures the quality of experience for over-the-top (OTT) voice services — mobile voice apps such as WhatsApp, Skype and Facebook Messenger — using a model derived from the International Telecommunication Union (ITU) approach for quantifying overall voice call quality and a series of calibrated technical parameters. This model characterizes the exact relationship between the technical measurements and perceived call quality. Voice App Experience for each operator is calculated on a scale from 0 to 100.

  • Excellent (95 or above)
  • Very Good (87 or more but less than 95)
  • Good (80 or more but less than 87)
  • Acceptable (74 or more but less than 80)
  • Poor (66 or more but less than 74)
  • Very Poor (60 or more but less than 66)
  • Unintelligible (45 or more but less than 60)
  • Impossible to communicate (Under 45)

5G Voice App Experience

The average Voice App Experience of Opensignal users when they were connected to an operator’s 5G network.

Voice App Experience – 5G Users

The average Voice App Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G voice app experience along with the availability of each technology.

4G Voice App Experience

The average Voice App Experience of Opensignal users on an operator's 4G network.

3G Voice App Experience

The average Voice App Experience of Opensignal users on an operator's 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO) network.

Group Video Calling Experience

Opensignal’s Group Video Calling Experience metric represents the proportion of video calls where all users on a call had at least an adequate or better video conference experience. Measured on a scale of 0-100, it considers the video and audio quality of all users on the call.

Download Speed Experience

Measured in Mbps, Opensignal's Download Speed Experience represents the typical everyday speeds a user experiences across an operator’s mobile data networks.

5G Download Speed

The average download speed observed by Opensignal users with active 5G connections.

Download Speed Experience – 5G Users

The average download speeds experienced by Opensignal users with a 5G device and a 5G subscription across an operator’s networks. It factors in 2G, 3G, 4G, and 5G download speeds along with the availability of each technology.

4G Download Speed

The average downlink speed observed by Opensignal users when they were connected to 4G.

3G Download Speed

The average downlink speed observed by Opensignal users when they were connected to 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO).

Upload Speed Experience

Measured in Mbps, Opensignal's Upload Speed Experience measures the average upload speeds for each operator observed by our users across their mobile data networks.

5G Upload Speed

The average upload speed observed by Opensignal users with active 5G connections.

Upload Speed Experience – 5G Users

The average upload speeds experienced by Opensignal users with a 5G device and a 5G subscription across an operator’s networks. It factors in 2G, 3G, 4G, and 5G upload speeds along with the availability of each technology.

4G Upload Speed

The average uplink speed observed by Opensignal users when they were connected to 4G.

3G Upload Speed

The average uplink speed observed by Opensignal users when they were connected to 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO).

Availability

Availability shows the proportion of time all Opensignal users on an operator’s network had either a 3G, 4G or 5G connection. Availability is not a measure of coverage or the geographic extent of a network.

5G Availability

5G Availability shows the proportion of time Opensignal users with a 5G device and a 5G subscription had an active 5G connection.

4G Availability

4G Availability shows the proportion of time Opensignal users with a 4G device and a 4G subscription — but who have never connected to 5G — had a 4G connection.

5G Reach

5G Reach measures how users experience the geographical extent of an operator’s 5G network. It analyzes the average proportion of locations where users were connected to a 5G network out of all the locations those users have visited. 5G Reach for each operator is measured on a scale from 0 to 10.

4G Coverage Experience

4G Coverage Experience measures how mobile subscribers experience 4G coverage on an operator’s network. Measured on a scale of 0-10, it analyzes the locations where customers of a network operator received a 4G signal relative to the locations visited by users of all network operators.

Excellent Consistent Quality

Excellent Consistent Quality is the percentage of users’ tests that met the minimum recommended performance thresholds to watch HD video, complete group video conference calls and play games. Excellent Consistent Quality is calculated using data from our sister company Tutela and follows its methodology, full details of which are published by Tutela.

Core Consistent Quality

Core Consistent Quality is the percentage of users’ tests that met the minimum recommended performance thresholds for lower performance applications including SD video, voice calls and web browsing. Core Consistent Quality is calculated using data from our sister company Tutela and follows its methodology, full details of which are published by Tutela.

How Opensignal collects and analyzes data

Our process begins with collecting billions of measurements daily from over 100 million devices globally.

Our software is installed within our own and partner apps. The partners we work with are strategically selected to cover a wide range of users, demographics, and devices.

Some speed tests published by other sources use data only from their own apps, which restricts the sample to a certain type of user, and some restrict data collection to just the latest devices, which again limits the user base.

We do not impose any such restrictions and aim for the widest sample base that most accurately reflects the make-up of the entire population.

We take extensive measures to ensure that the privacy of mobile users is respected through the entire data collection process within our own and our partners’ apps. The details can be found in our Data Privacy Charter.

The processing and analysis for all of the measurements we collect is based on best-in-class data science methods following the principles of our Experience Charter and our Analytics Charter.

The process is designed to ensure that incorrect or potentially distorting data cannot influence the results and that those results are shown in a way that can be clearly understood and relied upon.

Collection – General principles

  • Opensignal collects billions of individual measurements every day from over 100 million devices worldwide, under conditions of normal usage, including measurements in both indoor and outdoor locations. Because users spend most of their time indoors, most of our measurements are collected from indoor locations.
  • To calculate our video metric, we use video content providers selected to represent typical user experience. Our measurements are designed so that operators cannot optimize their networks to treat our traffic differently and thereby impact the results without making actual improvements to their networks.
  • Opensignal collects measurements of network speed based on both user-initiated tests and automated tests. The majority of measurements are generated through automated tests (no user interaction), executed independently and at random intervals to capture what users are experiencing at a typical moment in time. This approach is recognized as best practice by a number of official bodies including the FCC in the U.S.
  • Opensignal does not use dedicated test servers. We measure the end-to-end consumer network experience: the full path from the user device all the way to Content Delivery Networks (CDNs) such as Google, Akamai and Amazon.

Collection – Active speed measurements

When any application downloads data there is an initial ramp-up period, as the connection is established, during which the download speed is not the same as the stable speed achieved once the download is in progress.

On today’s networks, from a user experience point of view, speed-sensitive applications such as streaming video or large file downloads are influenced mainly by the stable speed (sometimes called the “goodput”), not the speed during the ramp-up time. Conversely, applications such as web browsing are influenced heavily by the ramp-up time and by latency.

Measuring Speed Experience: As the Opensignal Download Speed Experience metric is focused on measuring the user experience of applications such as streaming and large file downloads, which are influenced primarily by the stable speed, we use a fixed-time test rather than a simple fixed-file-size download. A fixed time period enables the speed measured to be a much closer representative of the stable speed the user experiences through the application.

Accurate comparisons: As well as being more representative, this fixed-time approach makes comparisons between widely different network speeds more meaningful. A file of a few MB downloaded over a 3G network will take several seconds, and the speed measured will be influenced primarily by the goodput and only slightly by the ramp-up. The download of the same file over an LTE-A network will take a much shorter time, and the overall speed measured will be dominated by the ramp-up time and unrelated to the speed seen by a streaming application.
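
The effect of ramp-up on the two test designs can be illustrated with a toy model of our own making: a link that ramps linearly to its stable speed over one second, then holds that speed. All numbers below are illustrative, not Opensignal's actual test parameters.

```python
def _bytes_after(t_s, stable_mbps, ramp_s=1.0):
    """Bytes transferred after t_s seconds on a link that ramps linearly
    to its stable speed over ramp_s seconds (simplified toy model)."""
    bps = stable_mbps * 1e6 / 8  # bytes per second at stable speed
    if t_s <= ramp_s:
        return bps * t_s * t_s / (2 * ramp_s)  # area under the linear ramp
    return bps * ramp_s / 2 + bps * (t_s - ramp_s)

def fixed_time_speed_mbps(stable_mbps, test_s=5.0):
    """Average speed reported by a fixed-duration test."""
    return _bytes_after(test_s, stable_mbps) * 8 / 1e6 / test_s

def fixed_size_speed_mbps(stable_mbps, file_mb=5.0, ramp_s=1.0):
    """Average speed reported by a fixed-file-size download."""
    target = file_mb * 1e6
    bps = stable_mbps * 1e6 / 8
    ramp_bytes = bps * ramp_s / 2
    if target <= ramp_bytes:  # the file finishes during the ramp-up
        t = (2 * target * ramp_s / bps) ** 0.5
    else:
        t = ramp_s + (target - ramp_bytes) / bps
    return target * 8 / 1e6 / t
```

On this model, a 5-second fixed-time test on a 100 Mbps link reports 90 Mbps, close to the stable speed, whereas a 5 MB fixed-size download on the same link finishes inside the ramp-up and reports roughly 45 Mbps. On a 5 Mbps 3G-like link the same file takes 8.5 seconds and reports about 4.7 Mbps, close to its goodput, which is exactly the asymmetry described above.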

Collection – Video Experience measurement

The Opensignal Video Experience measurement directly streams sample video from typical content providers and measures a range of parameters that directly impact the user experience, such as the loading time (the time taken for the video to start) and the video stalling ratio (the proportion of users who experience an interruption in playback after video begins streaming) for different picture qualities or bit rates.
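
Deriving these two parameters from a stream can be sketched against a hypothetical player event log; the event names and record shape here are our own illustration, not Opensignal's schema.

```python
def stream_stats(events):
    """Summarize one video stream from a hypothetical player event log:
    a list of (timestamp_seconds, event_name) tuples."""
    request = next(t for t, e in events if e == "request")
    first_frame = next(t for t, e in events if e == "first_frame")
    stalls = sum(1 for _, e in events if e == "stall_start")
    return {"loading_time_s": first_frame - request, "stalled": stalls > 0}

def stalling_ratio(streams):
    """Proportion of summarized streams that stalled after playback began."""
    return sum(1 for s in streams if s["stalled"]) / len(streams)
```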

Processing

Opensignal uses a rigorous post-processing system that takes the raw measurements and calculates robust and representative metrics. This includes a number of steps to quality-assure the measurements.

For example, if a user failed to download any content, the measurement is eliminated and treated as a “failed test” rather than being included in the average speed calculation.

Also, when calculating metrics for a given network technology (e.g. 4G), measurements where a network type change is detected (e.g. from 4G to 3G) during the measurement are not included.
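
These two exclusion rules amount to a simple per-measurement predicate. The field names below are illustrative, not Opensignal's actual schema.

```python
def keep_measurement(m):
    """Quality-assurance filter for one raw measurement (a dict with
    illustrative field names)."""
    if m.get("bytes_downloaded", 0) == 0:
        return False  # failed test: excluded from average speed calculations
    if m.get("start_network_type") != m.get("end_network_type"):
        return False  # network type changed during the measurement
    return True
```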

Initial filtering

We automatically filter out certain entries (e.g. when a phone is in a call) which are known to produce non-typical results.

Operator name mapping

To ensure that the results only reflect the experience of customers who bought the operator’s own branded service, we remove results from Mobile Virtual Network Operator (MVNO) subscribers and subscribers who are roaming. These subscribers may be subject to different Quality of Service (QoS) restrictions than an operator’s own customers, and so their experience may be different.

Selection of network type

We consolidate data into technology types — e.g. when considering 3G connections, we group HSDPA, HSUPA and UMTS R99 together.

Scientific averaging

We calculate a single average per device to ensure every device has an equal effect on the overall result. Essentially, we employ a “one device, one vote” policy in our calculations.
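
The “one device, one vote” step can be sketched as a two-stage average: each device's measurements are averaged first, then the per-device means are averaged with equal weight. The function name and data shape are ours.

```python
from collections import defaultdict

def one_device_one_vote(measurements):
    """measurements: iterable of (device_id, value) pairs.
    Average within each device first, then across devices equally."""
    per_device = defaultdict(list)
    for device_id, value in measurements:
        per_device[device_id].append(value)
    device_means = [sum(vals) / len(vals) for vals in per_device.values()]
    return sum(device_means) / len(device_means)
```

For example, a heavy tester contributing forty 10 Mbps samples and a light user contributing one 30 Mbps sample average to 20 Mbps under this policy, whereas a naive pooled mean would sit near 10.5 Mbps, dominated by the heavy tester.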

Removing extreme values

We eliminate a percentage of extreme high and low values. This removal of extremes is common data science practice and ensures the calculated average represents typical user experience.
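
This is a trimmed mean. The document does not state the exact fraction Opensignal trims, so the 1% per tail below is purely illustrative.

```python
def trimmed_mean(values, trim_fraction=0.01):
    """Mean after dropping the top and bottom trim_fraction of sorted
    values (the exact fraction used in production is not published)."""
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)
```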

Analytics – Reporting

Per Device Values: Per-device values are combined using a simple average to yield the Opensignal metrics found in our reports and analysis.

Confidence Intervals: We provide an upper and lower confidence interval estimate per operator, calculated using recognized standard techniques based on the sample size of measurements. Confidence intervals provide information on the margin of error, or precision, in the metric calculations. They represent the range in which the true value is very likely to lie, taking into account the entire range of data measurements.

Statistically significant results: Whenever the confidence intervals for two or more operators overlap in a particular metric, the result is a statistical tie. This is because one operator's apparent lead may not hold true once measurement uncertainty is taken into account. For this reason, awards in our reports can have multiple winners.
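
The tie rule can be sketched with a textbook normal-approximation interval; the exact interval construction Opensignal uses is not specified here, so this is one standard choice.

```python
import math

def confidence_interval_95(mean, std_dev, n):
    """Normal-approximation 95% confidence interval for a sample mean."""
    margin = 1.96 * std_dev / math.sqrt(n)
    return (mean - margin, mean + margin)

def is_statistical_tie(ci_a, ci_b):
    """Two operators tie on a metric when their intervals overlap."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]
```

With 10,000 samples and a standard deviation of 10, means of 50.0 and 50.3 overlap and tie, while 50.0 versus 51.0 do not: the larger gap survives the measurement uncertainty.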

Standardized geographical boundaries: A common practice in reports from other sources is to “cherry-pick” geographic boundaries or time periods to be able to make an observation about a specific operator, for example highlighting performance for a particular area of a city or over a particular time period. We do not do this: we only report on standardized geographical boundaries (where available) and over the entire period covered by the measurements. Our reporting timetable is under our control and not released to operators in advance. This ensures that reports represent the consistent experience of the majority of users.

Analytics – Video Experience

Opensignal’s Video Experience quantifies the quality of video streamed to mobile devices by measuring real-world video streams over an operator's networks. The metric is based on an International Telecommunication Union (ITU) approach, built upon detailed studies which have derived a relationship between technical parameters, including picture quality, video loading time and stall rate, and the perceived video experience as reported by real people.

To calculate Video Experience, we directly measure video streams from end-user devices and use this ITU approach to quantify the overall video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions — including Full HD (FHD) and 4K / Ultra HD (UHD) — and are streamed directly from the world’s largest video content providers.

The following scale can be used to relate the Video Experience scores to the actual experience our users received:

Excellent (75 or above)

Very consistent experience across all users, video streaming providers and resolutions tested, with fast loading times and almost non-existent stalling.

Very Good (65 or more but less than 75)

Generally fast loading times and only occasional stalling, but the experience might have been somewhat inconsistent across users and/or video providers/resolutions.

Good (55 or more but less than 65)

An acceptable but inconsistent experience, even from the same video streaming provider and particularly for higher resolutions, with noticeably slow loading times and stalling not uncommon.

Fair (40 or more but less than 55)

Not a good experience, either for higher resolution videos (very slow loading times and prolonged stalling) or for some video streaming providers. The experience on lower resolution videos from some providers might have been sufficient, though.

Poor (Under 40)

Not a good experience even for lower resolution videos across all providers. Very slow loading times and frequent stalling were common.

Analytics – Games Experience

Opensignal’s Games Experience measures how mobile users experience real-time multiplayer mobile gaming on an operator’s network. Measured on a scale of 0-100, it analyzes how our users’ multiplayer mobile gaming experience is affected by mobile network conditions including latency, packet loss and jitter.

Games Experience quantifies the experience when playing real-time multiplayer mobile games on mobile devices connected to servers located around the world. The approach is built on several years of research quantifying the relationship between technical network parameters and the gaming experience as reported by real mobile users. These parameters include latency (round trip time), jitter (variability of latency) and packet loss (the proportion of data packets that never reach their destination).

Additionally, it considers multiple genres of multiplayer mobile games to measure the average sensitivity to network conditions. The games tested include some of the most popular real-time multiplayer mobile games (such as Fortnite, Pro Evolution Soccer and Arena of Valor) played around the world.

Calculating Games Experience starts with measuring the end-to-end experience from users’ devices to internet end-points that host real games. The score is then reported on a scale from 0 to 100.

Excellent (85 or above)

The vast majority of users deemed this network experience acceptable. Nearly all respondents felt they had control over the game and received immediate feedback on their actions. In almost all cases there was no noticeable delay.

Good (75 or more but less than 85)

Most users deemed the experience acceptable. The gameplay experience was generally controllable and users received immediate feedback between their actions and the outcomes in the game. Most users did not experience a delay between their actions and the game.

Fair (65 or more but less than 75)

Users found the experience to be ‘average’. In most cases the game was responsive to the actions of the player, with most users reporting that they felt they had control over the game. However, the majority of players reported noticing a delay between their actions and the outcomes in the game.

Poor (40 or more but less than 65)

Most users found this level of experience unacceptable. The majority of users reported seeing a delay in the gameplay experience and did not receive immediate feedback on their actions. Many users felt a lack of controllability.

Very Poor (Under 40)

Nearly all users found this level of experience unacceptable. Almost all users experienced a noticeable delay within the game, with most not feeling in control of the gameplay. The vast majority of players did not receive immediate feedback on their actions.

Analytics – Voice App Experience

Opensignal's Voice App Experience measures the quality of experience for over-the-top (OTT) voice services — mobile voice apps such as WhatsApp, Skype and Facebook Messenger — using a model derived from the International Telecommunication Union (ITU) approach for quantifying overall voice call quality and a series of calibrated technical parameters.

This model characterizes the exact relationship between the technical measurements and perceived call quality. Voice App Experience for each operator is calculated on a scale from 0 to 100.

Opensignal's Voice App Experience has the following categories:

Excellent (95 or above)

Most users are very satisfied. The operator provides a consistently good OTT voice quality experience across the customer base.

Very Good (87 or more but less than 95)

Most users are satisfied. The operator generally provides a good OTT voice quality experience. Occasionally, there may be some impairments to the call, primarily related to the level of loudness.

Good (80 or more but less than 87)

Many users are satisfied. Minor quality impairments are experienced by some users. Sometimes the background is not quite clear: it can be either hazy or not loud enough. Clicking sounds or distortion are very occasionally present.

Acceptable (74 or more but less than 80)

Users are satisfied. Perceptible call quality impairments are experienced by some users. Short bursts of clicking sounds or distortion can be heard, and/or the volume may not be sufficiently loud. The listener is generally able to comprehend without repetition.

Poor (66 or more but less than 74)

Many users are dissatisfied. Call quality impairments are experienced by many users. Distortion, clicking sounds or silence occur during the call, which is perceptible and can be annoying.

Very Poor (60 or more but less than 66)

Most users are dissatisfied. Significant call quality impairments are experienced by most users. Occasional instances of distortion, clicking sounds or silence occur during the call. It can be difficult to understand parts of the conversation without repetition.

Unintelligible (45 or more but less than 60)

Nearly all users are dissatisfied. Frequent instances of long pauses, clicking sounds or distortion can be heard during the call. Frequent repetition is required for the conversation to be comprehensible, or there are frequent conversation overlaps.

Impossible to communicate (Under 45)

It is effectively impossible to hold a conversation.

Analytics – Group Video Calling Experience

Group Video Calling Experience measures the proportion of video calls where all users on a call had at least an adequate or better video conference experience.

In simple terms, Group Video Calling Experience measures whether all users in a group video call – not just a small number of users – had both sufficient (or better) video and audio quality. It therefore takes into account that a poor experience for one or more users will impact all users on a conference call, so a consistent experience across all users on a group video call is important.

Group Video Calling Experience uses measurements from our real-world video tests and our voice app calling tests. To calculate Group Video Calling Experience, we consider a range of scenarios that reflect typical numbers of call participants displayed during a smartphone video call – two, four and eight participants – to represent the real-world mobile video conference experience. Group Video Calling Experience for each operator is measured on a scale from 0 to 100.
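
The "all participants must be adequate" rule reduces to taking the minimum quality on each call. The 0-100 threshold below is a hypothetical stand-in; the document does not publish Opensignal's actual adequacy cutoff.

```python
ADEQUATE = 60  # hypothetical per-participant quality threshold (0-100)

def group_video_calling_experience(calls):
    """calls: list of per-call participant quality score lists. A call
    counts only when its worst participant meets the threshold; the
    result is the proportion of such calls on a 0-100 scale."""
    good = sum(1 for scores in calls if min(scores) >= ADEQUATE)
    return 100 * good / len(calls)
```

Note how one poor participant (the 40 in the second call below) disqualifies an otherwise good call, reflecting that a bad feed degrades the conference for everyone.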

Analytics – Availability

Opensignal's availability metrics are not measures of coverage or a network’s geographical extent. They won’t tell you whether you are likely to get a signal if you plan to visit a remote rural or nearly uninhabited region. Instead, they measure the proportion of time people have a network connection in the places they most commonly frequent — something often missed by traditional coverage metrics. Looking at when users have a connection, rather than where, provides a more precise reflection of the true user experience.

We also keep track of the instances that leave mobile users most frustrated: when there is no signal to connect to at all. The most common dead zones users struggle with occur indoors. As most of our availability data is collected indoors (as that’s where users spend most of their time), we are particularly well placed to detect areas of zero signal.

Opensignal's availability metrics take a user-centric, time-based approach that complements the user-centric, geographical-based methodology used by our reach metrics.

Our Availability metric shows the proportion of time all Opensignal users on an operator’s network had either a 3G, 4G or 5G connection.

4G Availability shows the proportion of time Opensignal users with a 4G device and a 4G subscription — but who have never connected to 5G — had a 4G connection.

5G Availability shows the proportion of time Opensignal users with a 5G device and a 5G subscription had an active 5G connection.
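
A time-share calculation of this kind can be sketched as below; the (duration, technology) sample shape is our illustration, not the actual measurement record.

```python
def availability_percent(samples, techs=("3G", "4G", "5G")):
    """samples: iterable of (duration_seconds, network_tech) observations
    for one operator's users. Returns the percentage of observed time
    spent connected to any of the listed technologies."""
    total = on_net = 0.0
    for duration, tech in samples:
        total += duration
        if tech in techs:
            on_net += duration
    return 100 * on_net / total
```

The technology-specific variants follow by narrowing both the user population and the technology list, e.g. `availability_percent(samples_from_5g_users, techs=("5G",))` for 5G Availability.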

Analytics – Reach

Opensignal’s reach metrics measure how mobile users experience the geographical extent of an operator’s network. They analyze the average proportion of locations where users were connected to a network out of all the locations those users have visited.

In simple terms, reach metrics measure the mobile experience in all the locations that matter most to everyday users — i.e. all the places where they live, work and travel. Our reach metrics for each operator are measured on a scale from 0 to 10.

Opensignal’s reach metrics provide a user-centric, geographical-based measure to complement the user-centric, time-based measure provided by availability metrics.

5G Reach measures how users experience the geographical extent of an operator’s 5G network. It analyzes the average proportion of locations where users were connected to a 5G network out of all the locations those users have visited. In simple terms, 5G Reach measures the 5G mobile experience in all the locations that matter most to everyday users – i.e. all the places where they live, work and travel. 5G Reach for each operator is measured on a scale from 0 to 10.
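
The per-user proportion-of-locations average can be sketched with sets of visited location cells; the data shape is our illustration of the description above.

```python
def reach_score(users):
    """users: list of (visited_locations, connected_locations) set pairs,
    one per user. Averages each user's proportion of visited locations
    where they had a connection, then scales the result to 0-10."""
    proportions = [
        len(connected & visited) / len(visited)
        for visited, connected in users
    ]
    return 10 * sum(proportions) / len(proportions)
```

Averaging per user first (rather than pooling all locations) keeps the metric user-centric: each user's travel pattern counts equally.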

Analytics – 4G Coverage Experience

4G Coverage Experience measures how mobile subscribers experience 4G coverage on an operator’s network. Measured on a scale of 0-10, it analyzes the locations where customers of a network operator received a 4G signal relative to the locations visited by users of all network operators.

In simple terms, 4G Coverage Experience measures the mobile coverage experience in all the locations that matter most to everyday users — i.e. all the places where they live, work and travel. It considers all the areas that Opensignal users visit and the proportion of those locations where 4G is available to them, with locations that more users visit carrying greater weight.

This differs from other, predictive measures of network service coverage that are based either on population or geography.

Analytics – Consistent Quality

Consistent Quality measures how often users' experience on a network was sufficient to support common applications' requirements. It measures download speed, upload speed, latency, jitter, packet loss, time to first byte and the percentage of attempted tests which did not succeed due to a connectivity issue on either the download or server response component.

The Consistent Quality metrics – Excellent Consistent Quality and Core Consistent Quality – are calculated using data from our sister company Tutela and follow its methodology, full details of which are published by Tutela.

Excellent Consistent Quality is the percentage of users' tests that met the minimum recommended performance thresholds to watch HD video, complete group video conference calls and play games.

Core Consistent Quality is the percentage of users' tests that met the minimum recommended performance thresholds for lower performance applications including SD video, voice calls and web browsing.

Summary

Our measurements are designed to capture as accurately as possible the experience of typical real users. They are subject to all the factors that affect real user traffic and represent a wide base of users and devices.

Our scientific analysis processes these measurements to create the most accurate possible picture of user experience and how it varies between operators, regions and countries, based as far as possible only on measurements from real users.

Our reports present our findings objectively, with confidence intervals shown and only statistically significant conclusions drawn.

Because of this, our reports are recognized as the most trusted source of mobile network experience around the world.

Learn more about the principles that govern our business in the Opensignal Manifesto.