Experience Charter

Opensignal’s analysis is designed to measure the true experience of end users as accurately as possible.

We compare networks based on measures that matter to customers, not abstract engineering parameters.

Opensignal performs the most representative analysis possible, ensuring the experience of the broadest base of users is reflected equally. Our tests are carefully designed to accurately reflect the user’s experience and to exclude sources of bias and error.

We believe these analysis policies, when combined with the data science techniques laid out in our Analytics Charter, lead the industry and produce the most accurate reports and insights, revealing the true experience.

  • Consistent methodology: Opensignal’s analysis methodology has been developed independently over several years and examined by many industry stakeholders. Fundamental to our approach is that we never change our methodology to suit the needs of a particular country or operator. Our methodology changes only when we make improvements, and those improvements are applied consistently everywhere. By doing this we ensure that global comparisons can be made.

  • Typical experience: Our tests are designed to represent the experience a typical user will receive. Unlike other methodologies, we are not seeking to represent the “best-case” performance of the network, or what a certain subset of optimized users would get. We are not looking to represent what the network is capable of under certain conditions or in the absence of real-world users. In designing our methodology, at every step we asked ourselves how to represent what an average user is likely to experience.

  • Wide range of real devices: We endeavor to analyze the widest range of devices possible, not just the latest devices used by a minority of users. This mix of devices means our analytics output truly measures the real-world experience, because we perform analysis across the full range of modems, chipsets, device brands and smartphone models used in the real world. We believe the impact of the device is a key element of the experience we measure. We quality-assure the analysis to account for specific device or software issues.

  • Automated random testing: The majority of our analysis is derived from tests that are carried out automatically and at random, without user intervention. This approach is endorsed by official bodies such as the FCC and produces quite different results from analysis generated when a user manually initiates a test (see the scheduling sketch after this list).

  • Device to real-world endpoint testing: Our active tests measure the true end-to-end experience by using endpoints that are representative of real user services. We use multiple Content Delivery Networks and video content providers, and these are regularly reviewed to ensure they reflect typical user traffic. We do not allow operators to deploy dedicated endpoint servers, and we design our tests in a way that prevents operators from identifying and optimizing our test traffic to record higher levels of performance.

  • Speed measures real sustained speed: The network speed tests behind our “Speed Experience” measures are designed to capture the speed a user will experience when using a speed-sensitive service such as streaming or a large file download (see the sustained-throughput sketch after this list).

  • Video measures real video performance: Our video test streams real content from providers selected to represent typical global traffic and measures factors that directly impact users. We do not extrapolate from general speed measures. This ensures that traffic management applied only to video is accurately reflected (see the playback-metrics sketch after this list).

  • Broad sample base: To ensure our results represent the broadest base of users, we work with a wide range of partner apps whose users span a broad set of demographics. Our analytics ensure that we represent the widest possible user base.
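
To illustrate the automated random testing principle above, here is a minimal Python sketch of randomized background scheduling. The interval bounds and the run_test callback are hypothetical, chosen only for illustration; they are not Opensignal’s actual parameters or code.

```python
import random
import time

# Assumed bounds on the gap between automatic background tests
# (hypothetical values, not Opensignal's actual configuration).
MIN_GAP_S = 4 * 3600
MAX_GAP_S = 12 * 3600

def run_background_tests(run_test):
    """Invoke run_test() forever at uniformly random intervals,
    with no user intervention and no fixed schedule, so samples
    are not skewed toward moments when a user chooses to test."""
    while True:
        time.sleep(random.uniform(MIN_GAP_S, MAX_GAP_S))
        run_test()  # e.g. a download, latency or video measurement
```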
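The sustained-speed idea can be sketched as follows: throughput is computed only after an initial warm-up window, so connection ramp-up effects such as TCP slow start do not inflate the figure toward a short burst. The warmup_s value and the sampling format are assumptions for this sketch, not Opensignal’s published method.

```python
def sustained_throughput(samples, warmup_s=2.0):
    """Estimate sustained speed from (elapsed_seconds, bytes_received) samples.

    Discards the first warmup_s seconds so TCP slow start and radio
    bearer setup do not dominate the result, then averages the rest.
    """
    steady = [(t, b) for t, b in samples if t >= warmup_s]
    if len(steady) < 2:
        raise ValueError("not enough samples after warm-up")
    (t0, b0), (t1, b1) = steady[0], steady[-1]
    return (b1 - b0) * 8 / (t1 - t0)  # bits per second

# Example: cumulative byte counts sampled once per second during a download.
samples = [(0, 0), (1, 2e6), (2, 6e6), (3, 12e6), (4, 18e6), (5, 24e6)]
print(sustained_throughput(samples))  # 48 Mbps over the steady-state window
```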
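For the video principle, the following is a hedged sketch of how user-facing playback metrics could be derived from a timeline of play and stall events, rather than extrapolated from speed tests. The PlaybackEvent structure and the metric definitions are illustrative assumptions, not Opensignal’s published formulas.

```python
from dataclasses import dataclass

@dataclass
class PlaybackEvent:
    t: float       # seconds since the stream was requested
    playing: bool  # True when video is rendering, False while stalled

def video_metrics(events, session_end):
    """Return (initial_load_time, stall_ratio) from a playback timeline."""
    # Time until the first frame renders; falls back to the session end
    # if playback never started.
    load_time = next((e.t for e in events if e.playing), session_end)
    stalled = 0.0
    for prev, cur in zip(events, events[1:]):
        if not prev.playing:          # time between a stall and the resume
            stalled += cur.t - prev.t
    if events and not events[-1].playing:
        stalled += session_end - events[-1].t  # stall ran to the end
    watch = session_end - load_time
    return load_time, (stalled / watch) if watch > 0 else 0.0

# Example: playback starts at 1.2 s, stalls once for 1.5 s, session ends at 20 s.
events = [PlaybackEvent(1.2, True), PlaybackEvent(5.0, False), PlaybackEvent(6.5, True)]
print(video_metrics(events, 20.0))  # (1.2, ~0.08)
```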