Methodology Overview

How Opensignal measures mobile network experience

We report as accurately as possible the real-world mobile experience as recorded by mobile network users.

To make decisions based on our market insights, network operators, regulators, analysts and consumers need to know the facts we present are accurate and that the comparisons we make are valid.

Read our Methodology Overview below or download a PDF copy.

Read on to understand

  • The definitions of the metrics we use in our reports and insights
  • The principles that guide our collection of data from mobile phones around the world
  • The data processing techniques we use to analyze the billions of measurements we collect daily
  • The rigorous scientific analytics that guide the interpretation of our results

Models, simulations and predictive measures of network performance cannot measure true user experience from the most relevant information source, actual mobile phone users.

Introduction

Opensignal’s objective is to report as accurately as possible the real-world mobile experience as recorded by mobile network users.

We strongly believe that:

  • What matters most when assessing network performance is how it is experienced by subscribers themselves. This is best understood by capturing the real-world experience of users as they interact with the network in the places where they live, work and travel.
  • By making this information transparently available, we address the important need for consumers to understand the most relevant aspect of network performance relative to them, namely how it translates to their user experience.

Although operators have been monitoring performance since the very first networks were built, there remains a disconnect between their engineering tests and the typical experience of everyday users.

We believe that the only way to bridge this gap is to measure the network using customers’ actual experience as a starting point. This user-centric approach provides a rich source of real-world user data from which mobile operators can accurately understand and evaluate the true network experience of their subscribers.

To make decisions based on our findings, consumers, regulators, network operators and other industry stakeholders need to be able to know that the facts we present are accurate and that the comparisons we make are valid. We openly share the principles that govern our business, the philosophy behind our experience measurements and the analysis behind them in our Independence Charter, Experience Charter and Analytics Charter.

This methodology document explains in more detail how we go about measuring mobile network experience and how we analyze our measurements to create the most accurate reports and insights about mobile experience globally.

Metrics in public reports

Opensignal’s public reports contain the set of metrics we believe best represent users’ experience of accessing real services. We continually invest in research and development to find new measures. For example, in 2018 we launched the telecom industry’s first independent measure of real-world mobile video experience, spanning multiple video content platforms.

We make more granular mobile analytics insights available to network operators to enable them to understand in greater detail, and ultimately improve, the network service they provide. This granular view can also be used by regulators and analysts. 

Our current list of metrics in public reports includes:

Video Experience

Opensignal’s Video Experience quantifies the quality of video streamed to mobile devices by measuring real-world video streams over an operator's networks. The metric is based on an International Telecommunication Union (ITU) approach, built upon detailed studies which have derived a relationship between technical parameters, including picture quality, video loading time and stall rate, with the perceived video experience as reported by real people. To calculate video experience, we are directly measuring video streams from end-user devices and using this ITU approach to quantify the overall video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions — including Full HD (FHD) and 4K / Ultra HD (UHD) — and are streamed directly from the world’s largest video content providers.

  • Excellent (78 or above)
  • Very Good (68 or more but less than 78)
  • Good (58 or more but less than 68)
  • Fair (48 or more but less than 58)
  • Poor (Under 48)

5G Video Experience

The average Video Experience of Opensignal users when they were connected to an operator’s 5G network. 

Video Experience – 5G Users

The average Video Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G video experience along with the availability of each technology.

4G Video Experience

The average Video Experience of Opensignal users on an operator's 4G network. 

3G Video Experience

The average Video Experience of Opensignal users on an operator’s 3G network. 

Live Video Experience

Opensignal’s Live Video Experience quantifies the quality of real-time video streamed to mobile devices by measuring video streams over an operator's network. The metric extends the existing International Telecommunication Union (ITU) approach used for Opensignal's on-demand Video Experience metric, built upon detailed studies which have derived a relationship between technical parameters, including live playback offset, picture quality, video loading time and stall rate, with the perceived live video experience as reported by real people. To calculate live video experience, we are directly measuring live video streams from end-user devices and using this extension of ITU's approach to quantify the overall live video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions and are streamed directly from the world’s largest video content providers.

  • Excellent (58 or above)
  • Very Good (53 or more but less than 58)
  • Good (43 or more but less than 53)
  • Fair (33 or more but less than 43)
  • Poor (under 33)

5G Live Video Experience

The average Live Video Experience of Opensignal users when they were connected to an operator’s 5G network.

Live Video Experience - 5G Users

The average Live Video Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G video experience along with the availability of each technology.

4G Live Video Experience

The average Live Video Experience of Opensignal users on an operator's 4G network. 

3G Live Video Experience

The average Live Video Experience of Opensignal users on an operator’s 3G network.

Games Experience

Opensignal’s Games Experience measures how mobile users experience real-time multiplayer mobile gaming on an operator’s network. Measured on a scale of 0-100, it analyzes how our users’ multiplayer mobile gaming experience is affected by mobile network conditions including latency, packet loss and jitter.

  • Excellent (85 or above)
  • Good (75 or more but less than 85)
  • Fair (65 or more but less than 75)
  • Poor (40 or more but less than 65)
  • Very Poor (Under 40) 

 

5G Games Experience

The average Games Experience of Opensignal users when they were connected to an operator’s 5G network. 

Games Experience – 5G Users

The average Games Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G games experience along with the availability of each technology. 

4G Games Experience

The average Games Experience of Opensignal users on an operator's 4G network. 

3G Games Experience

The average Games Experience of Opensignal users on an operator's 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO) network. 

Voice App Experience

Opensignal's Voice App Experience measures the quality of experience for over-the-top (OTT) voice services — mobile voice apps such as WhatsApp, Skype and Facebook Messenger — using a model derived from the International Telecommunication Union (ITU) approach for quantifying overall voice call quality and a series of calibrated technical parameters. This model characterizes the exact relationship between the technical measurements and perceived call quality. Voice App Experience for each operator is calculated on a scale from 0 to 100.

  • Excellent (95 or above) 
  • Very Good (87 or more but less than 95)
  • Good (80 or more but less than 87)
  • Acceptable (74 or more but less than 80)
  • Poor (66 or more but less than 74)
  • Very Poor (60 or more but less than 66)
  • Unintelligible (45 or more but less than 60) 
  • Impossible to communicate (Under 45)

 

5G Voice App Experience

The average Voice App Experience of Opensignal users when they were connected to an operator’s 5G network.

Voice App Experience – 5G Users

The average Voice App Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G voice app experience along with the availability of each technology.

4G Voice App Experience

The average Voice App Experience of Opensignal users on an operator's 4G network.

3G Voice App Experience

The average Voice App Experience of Opensignal users on an operator's 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO) network.

Group Video Calling Experience

Opensignal’s Group Video Calling Experience metric represents the proportion of video calls where all users on a call had at least an adequate or better video conference experience. Measured on a scale of 0-100, it considers the video and audio quality of all users on the call.

Download Speed Experience

Measured in Mbps, Opensignal's Download Speed Experience represents the typical everyday speeds a user experiences across an operator’s mobile data networks.

5G Download Speed

The average download speed observed by Opensignal users with active 5G connections.

Download Speed Experience – 5G Users

The average download speeds experienced by Opensignal users with a 5G device and a 5G subscription across an operator’s networks. It factors in 2G, 3G, 4G, and 5G download speeds along with the availability of each technology.

4G Download Speed

The average downlink speed observed by Opensignal users when they were connected to 4G.

3G Download Speed

The average downlink speed observed by Opensignal users when they were connected to 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO).

Upload Speed Experience

Measured in Mbps, Opensignal's Upload Speed Experience measures the average upload speeds for each operator observed by our users across their mobile data networks. 

5G Upload Speed

The average upload speed observed by Opensignal users with active 5G connections.

Upload Speed Experience – 5G Users

The average upload speeds experienced by Opensignal users with a 5G device and a 5G subscription across an operator’s networks. It factors in 2G, 3G, 4G, and 5G upload speeds along with the availability of each technology.

4G Upload Speed

The average uplink speed observed by Opensignal users when they were connected to 4G.

3G Upload Speed

The average uplink speed observed by Opensignal users when they were connected to 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO).

Availability

Availability shows the proportion of time all Opensignal users on an operator’s network had either a 3G, 4G or 5G connection. Availability is not a measure of a network's geographic extent.

5G Availability

5G Availability shows the proportion of time Opensignal users with a 5G device and a 5G subscription had an active 5G connection.

4G Availability

4G Availability shows the proportion of time Opensignal users with a 4G device and a 4G subscription — who have never connected to 5G — had a 4G connection.

5G Reach

5G Reach measures how users experience the geographical extent of an operator’s 5G network. It analyzes the average proportion of locations where users were connected to a 5G network out of all the locations those users have visited. 5G Reach for each operator is measured on a scale from 0 to 10.

Coverage Experience

The Opensignal Coverage Experience metric measures the extent of mobile networks in the places people live, work and travel. The metric represents the experience users receive as they travel around areas where they would reasonably expect to find coverage.

Consistent Quality

Consistent Quality measures whether the network is sufficient to support common mobile application requirements at a level that is ‘good enough’ for users to maintain (or complete) various typical tasks on their devices.

Reliability Experience

The Opensignal Reliability Experience metric measures the ability of Opensignal users to connect to and successfully complete tasks on communication service providers’ (CSP) networks.

How Opensignal collects and analyzes data

Our process begins with collecting billions of measurements daily from over 100 million devices globally.

Our software is installed within our own and partner apps. The partners we work with are strategically selected to cover a wide range of users, demographics, and devices.

Some speed tests published by other sources use data only from their own apps, which restricts the sample to a certain type of user, and some restrict data collection to just the latest devices, which again limits the user base.

We do not impose any such restrictions and aim for the widest sample base that most accurately reflects the make-up of the entire population.

We take extensive measures to ensure that the privacy of mobile users is respected through the entire data collection process within our own and our partners’ apps. The details can be found in our Data Privacy Charter.

The processing and analysis for all of the measurements we collect is based on best-in-class data science methods following the principles of our Experience Charter and our Analytics Charter.

The process is designed to ensure that incorrect or potentially distorting data is not able to influence the results and that those results are shown in a way that can be clearly understood and relied upon.

Collection – General principles

  • Opensignal collects billions of individual measurements every day from over 100 million devices worldwide, under conditions of normal usage, including measurements in both indoor and outdoor locations. Because users spend most of their time indoors, most of our measurements are collected from indoor locations.
  • To calculate our video metric, we use video content providers selected to represent typical user experience. Our measurements are designed so that operators cannot optimize their networks to treat our traffic differently and therefore impact the results without making actual improvements to their networks.
  • Opensignal collects measurements of network speed based on both user-initiated tests and automated tests. The majority of measurements are generated through automated tests (no user interaction), executed independently and at random intervals to capture what users are experiencing at a typical moment in time. This approach is recognized as best practice by a number of official bodies including the FCC in the U.S.

  • Opensignal does not use dedicated test servers. We measure the end-to-end consumer network experience and the full path from the user device all the way to the Content Delivery Networks (CDNs) such as Google, Akamai and Amazon.

Collection – Active speed measurements

When any application downloads data, there is an initial ramp-up period as the connection is established, during which the download speed is not the same as the stable speed achieved once the download is in progress.

On today’s networks, from a user experience point of view, speed-sensitive applications such as streaming video or large file downloads are influenced mainly by the stable speed (sometimes called the “goodput”), not the speed during the ramp-up time. Conversely, applications such as web browsing are influenced heavily by the ramp-up time and by latency.

Measuring Speed Experience: As the Opensignal Download Speed Experience metric is focused on measuring the user experience of applications such as streaming and large file downloads, which are influenced primarily by the stable speed, we use a fixed time test rather than a simple fixed file size download. A fixed time period enables the speed measured to be a much closer representation of the stable speed the user experiences through the application.

Accurate comparisons: As well as being more representative, this fixed time approach makes comparisons between widely different network speeds more meaningful. A file of a few MB downloaded over a 3G network will take several seconds and the speed measured will be influenced primarily by the goodput and only slightly by the ramp-up. The download of the same file over an LTE-A network will take a much shorter time and the overall speed measured will be dominated by the ramp-up time and unrelated to the speed seen by a streaming application.
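
To illustrate the difference a fixed time window makes, the following is a minimal Python sketch of a fixed-duration download measurement. The endpoint URL, window length and chunk size are assumptions chosen purely for illustration; they are not Opensignal's test parameters.

    import time
    import urllib.request

    def fixed_time_download_speed(url, duration_s=5.0, chunk_size=64 * 1024):
        """Download from `url` for a fixed time window and return the speed in Mbps.

        Because the measurement window is fixed rather than the file size,
        the result is dominated by the stable speed ("goodput") rather than
        by the initial ramp-up.
        """
        total_bytes = 0
        start = time.monotonic()
        with urllib.request.urlopen(url) as response:
            while time.monotonic() - start < duration_s:
                chunk = response.read(chunk_size)
                if not chunk:  # stream ended before the window closed
                    break
                total_bytes += len(chunk)
        elapsed = time.monotonic() - start
        return (total_bytes * 8) / (elapsed * 1_000_000)  # bits -> megabits per second

    # Hypothetical endpoint for illustration; Opensignal measures the full path
    # to CDNs such as Google, Akamai and Amazon rather than dedicated servers.
    # print(fixed_time_download_speed("https://example.com/large-file"))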

Collection – Video Experience measurement

The Opensignal Video Experience measurement directly streams sample video from typical content providers and measures a range of parameters that directly impact the user experience, such as the loading time (the time taken for the video to start) and the video stalling ratio (the proportion of users who experience an interruption in playback after video begins streaming) for different picture qualities or bit rates.
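
The sketch below shows how two of these parameters can be derived from player events, assuming a hypothetical per-session event record; the real measurement instruments actual streams from major content providers.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class PlaybackTrace:
        """Hypothetical per-session player events, in seconds from test start."""
        requested_at: float
        first_frame_at: float
        stalls: List[Tuple[float, float]] = field(default_factory=list)  # (start, end) of each stall

    def loading_time(trace):
        """Time taken for the video to start."""
        return trace.first_frame_at - trace.requested_at

    def stalling_ratio(traces):
        """Proportion of sessions interrupted after playback began."""
        return sum(len(t.stalls) > 0 for t in traces) / len(traces)

    sessions = [
        PlaybackTrace(0.0, 1.4),                         # started in 1.4 s, no stalls
        PlaybackTrace(0.0, 3.1, stalls=[(12.0, 14.5)]),  # one mid-playback stall
    ]
    print(loading_time(sessions[0]), stalling_ratio(sessions))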

Processing

Opensignal uses a rigorous post-processing system that takes the raw measurements and calculates robust and representative metrics. This includes a number of steps to quality-assure the measurements.

For example, if a user failed to download any content, this measurement is eliminated and treated as a “failed test” rather than being included in the average speed calculation.

Also, when calculating metrics on a given network technology (e.g. 4G), measurements where a network type change is detected (e.g. from 4G to 3G) during the duration of the measurement are not included.
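
The sketch below illustrates these two quality-assurance checks on a hypothetical measurement record; the field names are assumptions rather than Opensignal's actual schema, and the real pipeline applies many more checks.

    def keep_for_speed_average(record, technology="4G"):
        """Return True if a measurement should enter the average for `technology`."""
        if record["bytes_downloaded"] == 0:
            return False  # counted as a "failed test", not as a zero-speed sample
        if record["network_type_start"] != record["network_type_end"]:
            return False  # network type changed (e.g. 4G to 3G) during the test
        return record["network_type_start"] == technology

    measurements = [
        {"bytes_downloaded": 4_200_000, "network_type_start": "4G", "network_type_end": "4G"},
        {"bytes_downloaded": 0,         "network_type_start": "4G", "network_type_end": "4G"},
        {"bytes_downloaded": 1_900_000, "network_type_start": "4G", "network_type_end": "3G"},
    ]
    print(sum(keep_for_speed_average(m) for m in measurements))  # only the first survives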

Initial filtering

We automatically filter out certain entries (e.g. when a phone is in a call) that are known to produce non-typical results.

Operator name mapping

To ensure that the results only reflect the experience of customers who bought the operator’s own branded service, we remove results from Mobile Virtual Network Operator (MVNO) subscribers and subscribers who are roaming. These subscribers may be subject to different Quality of Service (QoS) restrictions than an operator’s own customers and so their experience may be different.

Selection of network type

We consolidate data into technology types — e.g. when considering 3G connections, we include HSDPA, HSUPA and UMTS R99 into one group.
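
A simplified sketch of this consolidation step follows, with an illustrative (not exhaustive or authoritative) mapping of reported radio technologies to groups.

    # Simplified consolidation of reported radio technologies into groups.
    # The technology strings are examples of what a device might report.
    TECHNOLOGY_GROUPS = {
        "UMTS": "3G", "HSDPA": "3G", "HSUPA": "3G",
        "LTE": "4G",
        "NR": "5G",
    }

    def network_group(radio_technology):
        return TECHNOLOGY_GROUPS.get(radio_technology, "Other")

    print(network_group("HSDPA"))  # -> "3G"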

Scientific averaging

We calculate a single average per device to ensure every device has an equal effect on the overall result. Essentially, we employ a “one device, one vote” policy in our calculations.

Removing extreme values

We eliminate a percentage of extreme high and low values. This removal of extremes is common data science practice and ensures the average calculated represents typical user experience.
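
Taken together, the two steps above amount to averaging per device and then trimming extreme values before the final average. The following is a minimal sketch, shown here as a trim applied to per-device averages with an illustrative trim fraction; it is not Opensignal's actual parameterization.

    from collections import defaultdict
    import numpy as np

    def operator_average_speed(samples, trim_fraction=0.01):
        """'One device, one vote': average each device's samples first, then
        average across devices after removing extreme per-device values.

        `samples` is an iterable of (device_id, speed_mbps) pairs. The 1%
        trim fraction is illustrative, not Opensignal's published figure.
        """
        per_device = defaultdict(list)
        for device_id, speed in samples:
            per_device[device_id].append(speed)

        device_means = np.array([np.mean(v) for v in per_device.values()])
        lo, hi = np.quantile(device_means, [trim_fraction, 1 - trim_fraction])
        trimmed = device_means[(device_means >= lo) & (device_means <= hi)]
        return float(np.mean(trimmed))

    samples = [("a", 25.0), ("a", 31.0), ("b", 12.0), ("c", 300.0), ("d", 22.0)]
    print(operator_average_speed(samples))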

Analytics – Reporting

Per Device Values: Per device values are combined using a simple average to yield the Opensignal metrics that are found in our reports and analysis. 

Confidence Intervals: We provide an upper and lower estimate of confidence interval per operator, calculated using recognized standard techniques based on the sample size of measurements. Confidence intervals provide information on the margins of error or the precision in the metric calculations. They represent the range in which the true value is very likely to be, taking into account the entire range of data measurements.

Statistically significant results: Whenever the confidence intervals for two or more operators overlap in a particular metric, the result is a statistical tie. This is because one operator's apparent lead may not hold true once measurement uncertainty is taken into account. For this reason, awards could have multiple winners in our reports.
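
As a sketch of how the overlap check works, the example below uses a simple normal-approximation interval on per-operator values; the actual interval technique and underlying data structure are not specified here and may differ.

    import math

    def mean_confidence_interval(values, z=1.96):
        """Normal-approximation 95% confidence interval for a mean (one of the
        standard techniques; shown only to illustrate the idea)."""
        n = len(values)
        mean = sum(values) / n
        variance = sum((v - mean) ** 2 for v in values) / (n - 1)
        margin = z * math.sqrt(variance / n)
        return mean - margin, mean + margin

    def statistical_tie(interval_a, interval_b):
        """Two operators are tied on a metric when their intervals overlap."""
        return interval_a[0] <= interval_b[1] and interval_b[0] <= interval_a[1]

    operator_a = mean_confidence_interval([24.1, 26.5, 22.8, 30.2, 25.7])
    operator_b = mean_confidence_interval([23.0, 21.4, 27.9, 24.3, 22.2])
    print(statistical_tie(operator_a, operator_b))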

Standardized geographical boundaries: A common practice in reports from other sources is to “cherry-pick” geographic boundaries or time periods to be able to make an observation about a specific operator. For example, highlighting performance for a particular area of a city, or over a particular time period. We do not do this and only report on standardized geographical boundaries (where available) and over the entire period covered by the measurements. Our reporting timetable is under our control and not released to operators in advance. This ensures that reports represent the consistent experience of the majority of users.

Analytics – Video Experience

Opensignal’s Video Experience quantifies the quality of video streamed to mobile devices by measuring real-world video streams over an operator's networks. The metric is based on an International Telecommunication Union (ITU) approach, built upon detailed studies which have derived a relationship between technical parameters, including picture quality, video loading time and stall rate, with the perceived video experience as reported by real people.

To calculate video experience, we are directly measuring video streams from end-user devices and using this ITU approach to quantify the overall video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions — including Full HD (FHD) and 4K / Ultra HD (UHD) — and are streamed directly from the world’s largest video content providers.

The following scale can be used to relate the Video Experience scores to the actual experience our users received:

Excellent (78 or above)

Our users were, on average, able to stream video at 1080p or better with fast loading times and no stalling.

Very Good (68 or more but less than 78)

Our users were, on average, able to stream video at 1080p or better with satisfactory loading times and little stalling.

Good (58 or more but less than 68)

Our users were, on average, able to stream video at 720p or better with satisfactory loading times and little stalling.

Fair (48 or more but less than 58)

Our users were, on average, able to stream video at 720p or better with satisfactory loading times and substantial stalling.

Poor (Under 48)

Our users, on average, encountered very high loading times or high levels of stalling or were only able to stream the video at resolutions below 720p.
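
For reference, the category boundaries above can be expressed as a simple bucketing function; the ITU-based model that produces the 0-100 score itself is not reproduced here.

    def video_experience_category(score):
        """Map a 0-100 Video Experience score to its published category,
        using the thresholds listed above."""
        if score >= 78:
            return "Excellent"
        if score >= 68:
            return "Very Good"
        if score >= 58:
            return "Good"
        if score >= 48:
            return "Fair"
        return "Poor"

    print(video_experience_category(71.5))  # -> "Very Good"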

Analytics – Live Video Experience

Opensignal’s Live Video Experience quantifies the quality of real-time video streamed to mobile devices by measuring video streams over an operator's network. The metric extends the existing International Telecommunication Union (ITU) approach used for Opensignal's on-demand Video Experience metric, built upon detailed studies which have derived a relationship between technical parameters, including live playback offset, picture quality, video loading time and stall rate, with the perceived live video experience as reported by real people.

To calculate live video experience, we are directly measuring live video streams from end-user devices and using this extension of ITU's approach to quantify the overall live video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions and are streamed directly from the world’s largest video content providers.

The following scale can be used to relate the Live Video Experience scores to the actual experience our users received:

Excellent (58 or above)

Our users were, on average, able to stream video at least at 1080p with low loading times, little stalling and a satisfactory live offset.

Very Good (53 or more but less than 58)

Our users were, on average, able to stream video at least at 720p or 1080p with low loading times, little stalling and a satisfactory live offset.

Good (43 or more but less than 53)

Our users were, on average, able to stream video at least at 720p with satisfactory loading times, little stalling and a substantial live offset.

Fair (33 or more but less than 43)

Our users were, on average, able to stream video at least at 480p with significant loading times, little stalling and a substantial live offset.

Poor (Under 33)

Our users, on average, encountered very high loading times, stalling or live offset or were only able to stream the video at resolutions below 480p.

Analytics – Games Experience

Opensignal’s Games Experience measures how mobile users experience real-time multiplayer mobile gaming on an operator’s network. Measured on a scale of 0-100, it analyzes how our users’ multiplayer mobile gaming experience is affected by mobile network conditions including latency, packet loss and jitter.

Games Experience quantifies the experience when playing real-time multiplayer mobile games on mobile devices connected to servers located around the world. The approach is built on several years of research quantifying the relationship between technical network parameters and the gaming experience as reported by real mobile users. These parameters include latency (round trip time), jitter (variability of latency) and packet loss (the proportion of data packets that never reach their destination).

Additionally, it considers multiple genres of multiplayer mobile games to measure the average sensitivity to network conditions. The games tested include some of the most popular real-time multiplayer mobile games (such as Fortnite, Pro Evolution Soccer and Arena of Valor) played around the world.

Calculating Games Experience starts with measuring the end-to-end experience from users’ devices to internet end-points that host real games. The score is then measured on a scale from 0 to 100.
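
Below is a minimal sketch of how the three parameters named above can be derived from a series of probes to a game endpoint. The probe data is invented, and the model that turns these parameters into the 0-100 score is not shown.

    import statistics

    def game_network_parameters(rtts_ms, packets_sent):
        """Derive latency, jitter and packet loss from probe results.

        `rtts_ms` holds round-trip times of probes that returned; probes that
        never returned count as lost.
        """
        latency = statistics.mean(rtts_ms)             # round trip time
        jitter = statistics.stdev(rtts_ms)             # variability of latency
        packet_loss = 1 - len(rtts_ms) / packets_sent  # packets that never arrived
        return latency, jitter, packet_loss

    # Hypothetical probe run: 20 probes sent towards a game server, 19 returned.
    rtts = [38, 41, 39, 44, 120, 40, 37, 42, 43, 39, 38, 41, 45, 40, 39, 38, 42, 41, 40]
    print(game_network_parameters(rtts, packets_sent=20))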

Excellent (85 or above)

The vast majority of users deemed this network experience acceptable. Nearly all respondents felt that they had control over the game and that they received immediate feedback on their actions. In almost all cases there was no noticeable delay.

Good (75 or more but less than 85)

Most users deemed the experience acceptable. The gameplay experience was generally controllable and the user received immediate feedback between their actions and the outcomes in the game. Most users did not experience a delay between their actions and the game.

Fair (65 or more but less than 75)

Users found the experience to be ‘average’. In most cases the game was responsive to the actions of the player with most users reporting that they felt like they had control over the game. The majority of players reported that they noticed a delay between their actions and the outcomes in the game.

Poor (40 or more but less than 65)

Most users found this level of experience unacceptable. The majority of users reported seeing a delay in the gameplay experience and they did not receive immediate feedback on their actions. Many users felt a lack of controllability.

Very Poor (Under 40)

Nearly all users found this level of experience unacceptable. Almost all users experienced a noticeable delay within the game, with most of them not feeling like they had control of the gameplay. The vast majority of players didn’t receive immediate feedback on their actions.

Analytics – Voice App Experience

Opensignal's Voice App Experience measures the quality of experience for over-the-top (OTT) voice services — mobile voice apps such as WhatsApp, Skype and Facebook Messenger — using a model derived from the International Telecommunication Union (ITU) approach for quantifying overall voice call quality and a series of calibrated technical parameters.

This model characterizes the exact relationship between the technical measurements and perceived call quality. Voice App Experience for each operator is calculated on a scale from 0 to 100.

Opensignal's Voice App Experience has the following categories:

Excellent (95 or above)

Most users are very satisfied. Operator provides consistently good OTT voice quality experience across the customer base. 

Very Good (87 or more but less than 95)

Most users are satisfied. Operator generally provides good OTT voice quality experience. Occasionally, there may be some impairments to the call, primarily related to level of loudness. 

Good (80 or more but less than 87)

Many users are satisfied. Minor quality impairments experienced by some users. Sometimes the background is not quite clear; it can be either hazy or not loud enough. Clicking sounds or distortion is very occasionally present.

Acceptable (74 or more but less than 80)

Users are satisfied. Perceptible call quality impairments experienced by some users. Short duration of clicking sounds or distortion can be heard, and/or the volume may not be sufficiently loud. Listener is generally able to comprehend without repetition. 

Poor (66 or more but less than 74)

Many users dissatisfied. Call quality impairments experienced by many users. Distortion, clicking sounds or silence experienced during the call, which is perceptible and can be annoying.

Very Poor (60 or more but less than 66)

Most users dissatisfied. Significant call quality impairments experienced by most users. Occasional instances of distortion, clicking sounds or silence experienced during the call. It can be difficult to understand parts of the conversation without repetition.

Unintelligible (45 or more but less than 60)

Nearly all users are dissatisfied. Frequent instances of long pauses, clicking sounds or distortion can be heard during the call. Frequent repetition is required to be comprehensible, or there are frequent conversation overlaps. 

Impossible to communicate (Under 45)

Impossible to communicate.

Analytics – Group Video Calling Experience

Group Video Calling Experience measures the proportion of video calls where all users on a call had at least an adequate or better video conference experience.

In simple terms, Group Video Calling Experience measures whether all users in a group video call – not just a small number of users – had both sufficient (or better) video and audio quality. It therefore takes into account that a poor experience for one or more users will impact all users on a conference call so having a consistent experience across all users on a group video call is important.

Group Video Calling Experience uses measurements from our real-world video tests and our voice app calling tests. To calculate Group Video Calling Experience, we consider a range of scenarios that reflect typical numbers of call participants displayed during a smartphone video call – two, four and eight participants – to represent the real-world mobile video conference experience. Group Video Calling Experience for each operator is measured on a scale from 0 to 100.
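
The defining rule, that a single poor participant makes the whole call count as poor, can be sketched as follows, assuming each call has already been reduced to per-participant adequacy flags (a representation chosen only for illustration).

    def group_video_calling_experience(calls):
        """Proportion (0-100) of calls where every participant was adequate or better.
        Each call is a list of booleans combining that participant's video and audio adequacy."""
        return 100 * sum(all(call) for call in calls) / len(calls)

    calls = [
        [True, True],               # two participants, both fine
        [True, True, False, True],  # one poor participant drags the whole call down
        [True] * 8,                 # eight participants, all fine
    ]
    print(round(group_video_calling_experience(calls), 1))  # 66.7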

Analytics – Availability

Opensignal's availability metrics are not measures of a network’s geographical extent. They won’t tell you whether you are likely to get a signal if you plan to visit a remote rural or nearly uninhabited region. Instead, they measure what proportion of time people have a network connection in the places they most commonly frequent — something often missed by traditional coverage metrics. Looking at when users have a connection, rather than where, provides a more precise reflection of the true user experience.

We also keep track of the instances that leave mobile users most frustrated: when there is no signal to connect to at all. The most common dead zones users struggle with occur indoors. Because most of our availability data is collected indoors (as that’s where users spend most of their time), we’re particularly adept at detecting areas of zero signal.

Opensignal's availability metrics take a user-centric, time-based approach that complements the user-centric, geography-based methodology used by our reach metrics.

Our Availability metric shows the proportion of time all Opensignal users on an operator’s network had either a 3G, 4G or 5G connection.

4G Availability shows the proportion of time Opensignal users with a 4G device and a 4G subscription — who have never connected to 5G — had a 4G connection.

5G Availability shows the proportion of time Opensignal users with a 5G device and a 5G subscription had an active 5G connection.
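
A minimal sketch of the time-based idea follows, assuming evenly spaced connection-state samples for a single user; eligibility rules such as the device and subscription checks described above are omitted.

    def availability(connection_samples, counted=frozenset({"3G", "4G", "5G"})):
        """Proportion of sampled time with a connection of a counted technology."""
        return sum(state in counted for state in connection_samples) / len(connection_samples)

    samples = ["4G", "4G", "5G", "No signal", "3G", "4G", "2G", "4G"]
    print(f"Availability: {availability(samples):.0%}")
    print(f"5G Availability: {availability(samples, counted={'5G'}):.0%}")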

Analytics – Reach

Opensignal’s reach metrics measure how mobile users experience the geographical extent of an operator’s network. They analyze the average proportion of locations where users were connected to a network out of all the locations those users have visited. 

In simple terms, reach metrics measure the mobile experience in all the locations that matter most to everyday users — i.e. all the places where they live, work and travel. Our reach metrics for each operator are measured on a scale from 0 to 10.

Opensignal’s reach metrics provide a user-centric, geography-based measure to complement the user-centric, time-based measure provided by our availability metrics.

5G Reach measures how users experience the geographical extent of an operator’s 5G network. It analyzes the average proportion of locations where users were connected to a 5G network out of all the locations those users have visited. In simple terms, 5G Reach measures the 5G mobile experience in all the locations that matter most to everyday users – i.e. all the places where they live, work and travel. 5G Reach for each operator is measured on a scale from 0 to 10.
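
A sketch of that per-user averaging follows, assuming each user's visited locations are reduced to grid-cell identifiers; that representation is an assumption made only for illustration.

    def five_g_reach(users):
        """`users` maps a user id to (locations_visited, locations_with_5g).
        Returns the average per-user share of visited locations with 5G, on a 0-10 scale."""
        shares = [len(with_5g) / len(visited) for visited, with_5g in users.values()]
        return 10 * sum(shares) / len(shares)

    users = {
        "u1": ({"cell_a", "cell_b", "cell_c", "cell_d"}, {"cell_a", "cell_b"}),
        "u2": ({"cell_a", "cell_e"}, {"cell_a"}),
    }
    print(round(five_g_reach(users), 1))  # 0-10 scale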

Analytics – Coverage Experience

The Opensignal Coverage Experience metric measures the extent of mobile networks in the places people live, work and travel. The metric represents the experience users receive as they travel around areas where they would reasonably expect to find coverage.

Traditional coverage metrics typically estimate either a percentage of land area covered, or a percentage of population covered; often neither will be an accurate measurement of the true user expectation and experience. In many markets there are areas where neither population density nor geographic area reflect the importance of coverage to users.

For example, in a large mountain range most users will not expect coverage in the wilderness, but coverage in the relatively small area of a ski resort is critical for the enjoyment of a holiday. Estimates based purely on population give undue significance to coverage in the most densely populated areas.

Coverage Experience measures geographic coverage of populated areas and therefore more accurately reflects the coverage expectations and experience of typical users. It can give a result that is somewhat different to traditional estimates based on either geographic or population measures. The metric uses a scale from 0 to 10.

Analytics – Consistent Quality

Consistent Quality measures whether the network is sufficient to support common mobile application requirements at a level that is ‘good enough’ for users to maintain (or complete) various typical tasks on their devices.

We combine different experience indicators such as download speed, upload speed, latency, jitter, packet discard, and time to first byte to calculate Consistent Quality. These components are evaluated against thresholds recommended by the more demanding of the common applications used for a range of typical tasks.

To calculate the metric value, the proportion of tests that pass the requirements of Consistent Quality is multiplied by the test success ratio, which is the proportion of completed tests to all tests conducted. Tests that pass indicate that activities such as video calling, uploading an image to social media, or using smart home applications will be possible without noticeable lag or slowdown.
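
That arithmetic can be sketched as below. The threshold values are placeholders rather than Opensignal's actual requirements, and only three of the six indicators are illustrated.

    THRESHOLDS = {"download_mbps": 5.0, "upload_mbps": 1.5, "latency_ms": 100.0}

    def passes(test):
        """Whether a completed test meets all (placeholder) thresholds."""
        return (test["download_mbps"] >= THRESHOLDS["download_mbps"]
                and test["upload_mbps"] >= THRESHOLDS["upload_mbps"]
                and test["latency_ms"] <= THRESHOLDS["latency_ms"])

    def consistent_quality(completed_tests, total_tests):
        """Pass rate among completed tests multiplied by the test success ratio."""
        pass_rate = sum(passes(t) for t in completed_tests) / len(completed_tests)
        success_ratio = len(completed_tests) / total_tests
        return 100 * pass_rate * success_ratio

    completed = [
        {"download_mbps": 12.0, "upload_mbps": 3.0, "latency_ms": 45},
        {"download_mbps": 3.0,  "upload_mbps": 0.8, "latency_ms": 180},
    ]
    print(consistent_quality(completed, total_tests=3))  # 100 * 0.5 * (2/3) ≈ 33.3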

Analytics – Reliability Experience

Opensignal’s Reliability Experience measures the ability of Opensignal users to connect to and successfully complete (basic) tasks on communication service providers’ (CSP) networks. It analyzes how much Opensignal users’ experience is affected by the radio access and core network, along with issues that prevent them from connecting to the internet even if they have a connection to their CSP’s network. It also factors in users’ ability to successfully use lower performance applications including SD video, over-the-top voice calls and web browsing.

It does not test standard voice calls performed directly through CSPs’ networks — i.e. those that are carried out via smartphones’ in-built call functionality.

Calculated on a scale of 100-1000 — with higher scores being better — across all generations of mobile technology, Reliability Experience consists of the following components:

  • % time connected — the proportion of time Opensignal users can successfully connect to a mobile network (if appropriate)
  • (Data) Connectivity — the proportion of time when the network is available and the device can connect to the internet
  • Task completion — whether tasks initiated by the user’s device are completed
  • Sufficiency — the probability that (basic) tasks will be executed sufficiently well for the user
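
As a rough sketch, each of the four components can be computed as a proportion over hypothetical per-session records; how Opensignal weights and combines them into the final 100-1000 score is not covered by this document, so no combination is shown.

    def reliability_components(sessions):
        """Component proportions only. The per-session record layout (boolean
        fields) is an assumption, and the weighting into the 100-1000 score
        is intentionally not shown."""
        n = len(sessions)
        return {
            "time_connected":    sum(s["network_available"] for s in sessions) / n,
            "data_connectivity": sum(s["internet_reachable"] for s in sessions) / n,
            "task_completion":   sum(s["task_completed"] for s in sessions) / n,
            "sufficiency":       sum(s["task_sufficient"] for s in sessions) / n,
        }

    sessions = [
        {"network_available": True, "internet_reachable": True,  "task_completed": True,  "task_sufficient": True},
        {"network_available": True, "internet_reachable": False, "task_completed": False, "task_sufficient": False},
    ]
    print(reliability_components(sessions))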

Summary

Our measurements are designed to capture as accurately as possible the experience of typical real users and are subject to all the factors that affect real user traffic, representing a wide base of users and devices.

Our scientific analysis processes these measurements to create the most accurate possible picture of user experience and how it varies between operators, regions and countries, as far as possible based only on measurements from real users.

Our reports present our findings objectively, with confidence intervals shown and only drawing statistically significant conclusions.

Because of this, our reports are recognized as the most trusted source of mobile network experience around the world.                    

Learn more about the principles that govern our business in the Opensignal Manifesto.