Understanding mobile network experience: What do Opensignal's metrics mean?

In this blog post, we’re taking a deep dive into the very core of Opensignal: explaining the metrics we use, what they all mean and what their roles are in measuring the real-world mobile network experience as users see it.

How do we collect the data in the first place?

We collect billions of individual measurements every day from over 100 million devices worldwide. We collect data every day of the week, at all hours and in all the places people live, work and travel: no simulations, no predictions, no idealized testing conditions. Our data comes from actual smartphone users and we report users’ actual network experience, whether they are indoors or out, bustling in a busy city or trekking in the countryside.

We collect the vast majority of our data via automated tests that run in the background, enabling us to report on users’ real-world mobile experience at the largest scale and frequency in the industry. These automated tests are run at random points in time and therefore represent the typical experience available to a user at any given moment.

 

Video Experience – Measuring real-world mobile video streams

Opensignal’s Video Experience quantifies the quality of video streamed to mobile devices by measuring real-world video streams over an operator's networks. The metric is based on an International Telecommunication Union (ITU) approach, built upon detailed studies that have derived a relationship between technical parameters, including picture quality, video loading time and stall rate, and the perceived video experience as reported by real people. To calculate Video Experience, we directly measure video streams from end-user devices and use this ITU approach to quantify the overall video experience for each operator on a scale from 0 to 100. The videos tested include a mixture of resolutions — including Full HD (FHD) and 4K / Ultra HD (UHD) — and are streamed directly from the world’s largest video content providers.

The following scale can be used to relate the Video Experience scores to the actual experience our users received:

  • Excellent (75 or above): Very consistent experience across all users, video streaming providers and resolutions tested, with fast loading times and almost non-existent stalling.
  • Very Good (65 or more but less than 75): Generally fast loading times and only occasional stalling, but the experience might have been somewhat inconsistent across users and/or video providers/resolutions.
  • Good (55 or more but less than 65): An acceptable but inconsistent experience, even from the same video streaming provider and particularly for higher resolutions, with noticeably slow loading times and stalling not uncommon.
  • Fair (40 or more but less than 55): Not a good experience either for higher resolution videos (very slow loading times and prolonged stalling) or for some video streaming providers. The experience on lower resolution videos from some providers might have been sufficient, though.
  • Poor (under 40): Not a good experience even for lower resolution videos across all providers. Very slow loading times and frequent stalling were common.
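As a rough illustration, the banding above is a simple threshold lookup. The thresholds come from the list above, but the function itself is an invented sketch, not Opensignal's actual code:

```python
# Hypothetical sketch: map a 0-100 Video Experience score to the rating
# bands listed above. Thresholds come from the post; the function name
# and structure are illustrative only.

def video_experience_band(score: float) -> str:
    """Return the rating band for a Video Experience score."""
    if score >= 75:
        return "Excellent"
    if score >= 65:
        return "Very Good"
    if score >= 55:
        return "Good"
    if score >= 40:
        return "Fair"
    return "Poor"
```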

With video being the single largest category of traffic carried on mobile networks and consumption expected to grow to keep pace with consumer demand, this is an extremely relevant metric. It is another key element in Opensignal’s ongoing mission to measure the actual consumer experience on mobile networks.

In addition to Video Experience, we report on the following metrics related to video experience:

  • 5G Video Experience: The average Video Experience of Opensignal users on an operator's 5G network. 
  • Video Experience – 5G Users: The average Video Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G video experience along with the availability of each technology.
  • 4G Video Experience: The average Video Experience of Opensignal users on an operator's 4G network.
  • 3G Video Experience: The average Video Experience of Opensignal users on an operator’s 3G network.

 

Games Experience – Measuring real-time multiplayer mobile gaming

Opensignal’s Games Experience measures how mobile users experience real-time multiplayer mobile gaming on an operator’s network. Measured on a scale of 0-100, it analyzes how our users’ multiplayer mobile gaming experience is affected by mobile network conditions including latency, packet loss and jitter.

Games Experience quantifies the experience when playing real-time multiplayer mobile games on mobile devices connected to servers located around the world. The approach is built on several years of research quantifying the relationship between technical network parameters and the gaming experience as reported by real mobile users. These parameters include latency (round trip time), jitter (variability of latency) and packet loss (the proportion of data packets that never reach their destination). Additionally, it considers multiple genres of multiplayer mobile games to measure the average sensitivity to network conditions. The games tested include some of the most popular real-time multiplayer mobile games (such as Fortnite, Pro Evolution Soccer and Arena of Valor) played around the world. 

Calculating Games Experience starts with measuring the end-to-end experience from users’ devices to internet end-points that host real games. The resulting score is then expressed on a scale from 0 to 100.

The Opensignal mobile Games Experience has the following categories:

  • Excellent (85 or above): The vast majority of users deemed this network experience acceptable. Nearly all respondents felt they had control over the game and received immediate feedback on their actions. In almost all cases there was no noticeable delay.
  • Good (75 or more but less than 85): Most users deemed the experience acceptable. Gameplay was generally controllable and users received immediate feedback on their actions. Most users did not experience a delay between their actions and the outcomes in the game.
  • Fair (65 or more but less than 75): Users found the experience to be ‘average’. In most cases the game was responsive to the actions of the player, with most users reporting that they felt in control of the game. However, the majority of players noticed a delay between their actions and the outcomes in the game.
  • Poor (40 or more but less than 65): Most users found this level of experience unacceptable. The majority of users reported a delay in the gameplay experience and did not receive immediate feedback on their actions. Many users felt a lack of control.
  • Very Poor (Under 40): Nearly all users found this level of experience unacceptable. Almost all users experienced a noticeable delay within the game, with most of them not feeling in control of the gameplay. The vast majority of players did not receive immediate feedback on their actions.
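To make the three network parameters concrete, here is a minimal sketch of how latency, jitter and packet loss could be derived from a series of round-trip probes. The sample data, function name and the particular jitter definition used (mean absolute difference of successive round-trip times) are assumptions for illustration, not Opensignal's methodology:

```python
from statistics import mean

# Illustrative only: derive latency, jitter and packet loss from a list
# of round-trip times in milliseconds, where None marks a lost probe.

def network_parameters(rtts_ms):
    received = [r for r in rtts_ms if r is not None]
    latency = mean(received)                        # average round-trip time
    # Jitter here is the mean absolute difference between successive RTTs,
    # one common way to express variability of latency.
    jitter = mean(abs(a - b) for a, b in zip(received, received[1:]))
    packet_loss = 1 - len(received) / len(rtts_ms)  # fraction never returned
    return latency, jitter, packet_loss

latency, jitter, loss = network_parameters([40, 42, None, 45, 41])
```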

In addition to Games Experience, we report on the following metrics related to games experience:

  • 5G Games Experience: The average Games Experience of Opensignal users when they were connected to an operator’s 5G network.
  • Games Experience – 5G Users: The average Games Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G games experience along with the availability of each technology.
  • 4G Games Experience: The average Games Experience of Opensignal users on an operator's 4G network.
  • 3G Games Experience: The average Games Experience score across an operator’s 3G connections (e.g. UMTS/HSPA or CDMA 1X EV-DO).


Voice App Experience – Quantifying voice quality

Opensignal's Voice App Experience measures the quality of experience for over-the-top (OTT) voice services — mobile voice apps such as WhatsApp, Skype and Facebook Messenger — using a model derived from the International Telecommunication Union (ITU) approach for quantifying overall voice call quality and a series of calibrated technical parameters. This model characterizes the exact relationship between the technical measurements and perceived call quality. Voice App Experience for each operator is calculated on a scale from 0 to 100.

The following scale can be used to relate the Voice App Experience scores to the actual experience our users received:

  • Excellent (95 or above): Most users were very satisfied and enjoyed a consistently good OTT voice quality experience.
  • Very Good (87 or more but less than 95): Most users were satisfied and enjoyed a generally good OTT voice quality experience. Occasionally, there may have been some impairments to the call, primarily related to the level of loudness. 
  • Good (80 or more but less than 87): Many users were satisfied. Minor quality impairments were experienced by some users. Sometimes the audio was not quite clear: it could have been either hazy or not loud enough. Clicking sounds or distortion were very rarely present.
  • Acceptable (74 or more but less than 80): Some users were satisfied. Perceptible call quality impairments were experienced by some users. Clicking sounds of short duration or distortion were heard, and/or the volume may not have been sufficiently loud. Listeners were generally able to comprehend without repetition.  
  • Poor (66 or more but less than 74): Many users were dissatisfied. Call quality impairments were experienced by many users. Distortion, clicking sounds or silence were experienced during the call. These were perceptible and may have been annoying.
  • Very Poor (60 or more but less than 66): Most users were dissatisfied. Significant call quality impairments were experienced by most users. Occasional instances of distortion, clicking sounds or silence were experienced during the call. It may have been difficult to understand parts of the conversation without repetition.
  • Unintelligible (45 or more but less than 60): Nearly all users were dissatisfied. Frequent instances of long pauses, clicking sounds or distortion may have been heard during the call. Frequent repetition was required for the conversation to be comprehensible, or there were frequent conversation overlaps.
  • Impossible to communicate (Under 45) 

In addition to Voice App Experience, we report on the following metrics related to voice apps:

  • 5G Voice App Experience: The average Voice App Experience of Opensignal users when they were connected to an operator’s 5G network.
  • Voice App Experience – 5G Users: The average Voice App Experience of Opensignal users with a 5G device and a 5G subscription across an operator's networks. It factors in 2G, 3G, 4G and 5G voice app experience along with the availability of each technology.
  • 4G Voice App Experience: The average Voice App Experience of Opensignal users on an operator's 4G network.
  • 3G Voice App Experience: The average Voice App Experience of Opensignal users on an operator's 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO) network.

 

Group Video Calling Experience – Measuring the video conference experience

Opensignal's Group Video Calling Experience measures the proportion of video calls where all users on a call had at least an adequate or better video conference experience. In simple terms, Group Video Calling Experience measures whether all users in a group video call — not just a small number of users — had both sufficient (or better) video and audio quality. It therefore takes into account that a poor experience for one or more users will impact all users on a conference call, so having a consistent experience across all users on a group video call is important.

Group Video Calling Experience uses measurements from our real-world video tests and our voice app calling tests. To calculate Group Video Calling Experience, we consider a range of scenarios that reflect typical numbers of call participants displayed during a smartphone video call – two, four and eight participants – to represent the real-world mobile video conference experience. Group Video Calling Experience for each operator is measured on a scale from 0 to 100.
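The "weakest link" logic described above can be sketched as follows. The participant scores, the adequacy threshold of 55 and the function names are all invented for illustration; only the idea that a call counts as good when every participant clears the bar comes from the text:

```python
# Illustrative sketch: a group call is good only if ALL participants had
# at least adequate quality. Threshold and scores are invented.

def call_is_adequate(participant_scores, threshold=55):
    return all(s >= threshold for s in participant_scores)

def group_calling_experience(calls, threshold=55):
    """Share of calls (0-100) in which every participant was adequate."""
    good = sum(call_is_adequate(c, threshold) for c in calls)
    return 100 * good / len(calls)

# Two-, four- and eight-participant calls, mirroring the scenarios above.
calls = [
    [70, 62],
    [80, 58, 51, 66],                  # one weak participant spoils the call
    [75, 70, 68, 60, 59, 72, 61, 57],
]
```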

 

Download Speed Experience – The real speeds users get

Measured in Mbps, Opensignal's Download Speed Experience represents the typical everyday speeds a user experiences across an operator’s mobile data networks.

In addition to Download Speed Experience, we report on the following metrics related to download speeds:

  • 5G Download Speed: The average download speed observed by Opensignal users with active 5G connections.
  • Download Speed Experience - 5G Users: The average download speeds experienced by Opensignal users with a 5G device and a 5G subscription across an operator’s networks. It factors in 2G, 3G, 4G, and 5G download speeds along with the availability of each technology.
  • 4G Download Speed: The average downlink speed observed by Opensignal users when they were connected to 4G.
  • 3G Download Speed: The average downlink speed observed by Opensignal users when they were connected to 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO).
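The "Download Speed Experience - 5G Users" composite above blends per-generation speeds with how often each technology was available. A hedged sketch of that weighting, with all figures invented for illustration:

```python
# Illustrative only: weight each generation's average download speed by
# the share of time users spent connected to that technology. These
# numbers are invented, not measured values.

avg_speed_mbps = {"2G": 0.2, "3G": 4.0, "4G": 30.0, "5G": 150.0}
time_share     = {"2G": 0.02, "3G": 0.05, "4G": 0.58, "5G": 0.35}

overall = sum(avg_speed_mbps[g] * time_share[g] for g in avg_speed_mbps)
```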

 

Upload Speed Experience – Measuring the upstream capacity

Opensignal's Upload Speed Experience measures the average upload speeds for each operator observed by our users across their mobile data networks. Typically upload speeds are slower than download speeds, as current mobile broadband technologies focus resources on providing the best possible download speed for users consuming content on their devices. As mobile internet trends move away from downloading content to creating content and supporting real-time communications services, upload speeds are becoming more vital and new technologies are emerging that boost upstream capacity.

In addition to Upload Speed Experience, we report on the following metrics related to upload speeds:

  • 5G Upload Speed: The average upload speed observed by Opensignal users with active 5G connections.
  • Upload Speed Experience – 5G Users: The average upload speeds experienced by Opensignal users with a 5G device and a 5G subscription across an operator’s networks. It factors in 2G, 3G, 4G, and 5G upload speeds along with the availability of each technology.
  • 4G Upload Speed: The average uplink speed observed by Opensignal users when they were connected to 4G.
  • 3G Upload Speed: The average uplink speed observed by Opensignal users when they were connected to 3G (e.g. UMTS/HSPA or CDMA 1X EV-DO).

 

Availability – A user-centric approach

Opensignal's availability metrics are not a measure of a network’s geographical extent. They won’t tell you whether you are likely to get a signal if you plan to visit a remote rural or nearly uninhabited region. Instead, they measure the proportion of time people have a network connection in the places they most commonly frequent — something often missed by traditional coverage metrics. Looking at when users have a connection, rather than where, provides a more precise reflection of the true user experience.

We also keep track of the instances that leave mobile users most frustrated: when there is no signal to connect to at all. The most common dead zones users struggle with occur indoors. As most of our availability data is collected indoors (as that’s where users spend most of their time), we’re particularly adept at detecting areas of zero signal.

Our availability metrics take a user-centric, time-based approach that complements the user-centric and geographical-based methodology used by our reach metrics. We report on three supporting metrics related to availability:

  • Availability: The proportion of time all Opensignal users on an operator’s network had either a 3G, 4G or 5G connection. 
  • 5G Availability: The proportion of time Opensignal users with a 5G device and a 5G subscription had an active 5G connection.
  • 4G Availability: The proportion of time Opensignal users with a 4G device and a 4G subscription (and who had never connected to 5G) had a 4G connection.
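The time-based approach boils down to a simple proportion. In this hedged sketch, background samples are tagged with the technology seen at that moment; the sample list and labels are invented for illustration:

```python
# Illustrative sketch of time-based availability: the share of background
# samples in which a user had a 3G, 4G or 5G connection.

samples = ["4G", "4G", "5G", "no_signal", "3G", "4G", "2G", "5G"]

connected = sum(s in {"3G", "4G", "5G"} for s in samples)
availability = 100 * connected / len(samples)   # percent of time connected
```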

 

Reach – Measuring geographical extent

Opensignal’s reach metrics measure how mobile users experience the geographical extent of an operator’s network. They analyze the average proportion of locations where users were connected to a network out of all the locations those users have visited. 

In simple terms, reach metrics measure the mobile experience in all the locations that matter most to everyday users — i.e. all the places where they live, work and travel. Our reach metrics for each operator are measured on a scale from 0 to 10.

Opensignal’s reach metrics provide a user-centric, geographical-based measure to complement the user-centric, time-based measure provided by availability metrics.

5G Reach measures how users experience the geographical extent of an operator’s 5G network. It analyzes the average proportion of locations where users were connected to a 5G network out of all the locations those users have visited. In simple terms, 5G Reach measures the 5G mobile experience in all the locations that matter most to everyday users – i.e. all the places where they live, work and travel. 5G Reach for each operator is measured on a scale from 0 to 10.
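The reach calculation above can be sketched as a per-user proportion of locations, averaged and scaled to 0-10. Modeling locations as tile IDs, and all data shapes and values, are assumptions for illustration:

```python
# Illustrative sketch of a reach calculation: per user, the fraction of
# visited locations (modeled here as tile IDs) with a connection,
# averaged over users and scaled to 0-10. IDs and pairs are invented.

def reach_score(users):
    """users: list of (visited_tiles, connected_tiles) set pairs."""
    fractions = [len(connected & visited) / len(visited)
                 for visited, connected in users]
    return 10 * sum(fractions) / len(fractions)

users = [
    ({1, 2, 3, 4}, {1, 2, 3}),   # connected in 3 of the 4 tiles visited
    ({1, 5}, {1, 5}),            # connected everywhere visited
]
```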

 

4G Coverage Experience – Measuring the 4G user experience

4G Coverage Experience measures how mobile subscribers experience 4G coverage on an operator’s network. Measured on a scale of 0-10, it analyzes the locations where customers of a network operator received a 4G signal relative to the locations visited by users of all network operators. In simple terms, 4G Coverage Experience measures the mobile coverage experience in all the locations that matter most to everyday users — i.e. all the places where they live, work and travel. It considers all the areas that Opensignal users visit and the proportion of those locations where 4G is available to them, with locations visited by more users carrying greater weight.
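The visit-weighting idea can be sketched as below: locations visited by more users count for more. The location names, visit counts and the 0-10 scaling shown are invented for illustration, not Opensignal's published model:

```python
# Illustrative only: weight each location by how many users visit it.
# `visits` counts users seen in each location across all operators;
# `has_4g` marks where this operator's users received a 4G signal.

visits = {"downtown": 500, "suburb": 200, "highway": 50}
has_4g = {"downtown": True, "suburb": True, "highway": False}

total = sum(visits.values())
covered = sum(v for loc, v in visits.items() if has_4g[loc])
coverage_experience = 10 * covered / total   # visit-weighted, 0-10 scale
```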

 

Consistent Quality – Measuring users' network experience

Consistent Quality measures how often users’ experience on a network was sufficient to support common applications’ requirements. It measures download speed, upload speed, latency, jitter, packet loss, time to first byte and the percentage of tests attempted which did not succeed due to a connectivity issue on either the download or server response component.

The Consistent Quality metrics – Excellent Consistent Quality and Core Consistent Quality – are calculated using data from our sister company Tutela and use its methodology, full details of which can be found here.

  • Excellent Consistent Quality: The percentage of users' tests that met the minimum recommended thresholds for users to watch HD video, complete group video conference calls and play games. 
  • Core Consistent Quality: The percentage of users' tests that met the minimum recommended performance thresholds for lower performance applications including SD video, voice calls and web browsing. 


Feel like checking the numbers for yourself? Download our Opensignal or Meteor app. Both are available on Android and iOS!