Experience Charter

Opensignal’s analysis is designed to measure the true network experience of end users as accurately as possible. We compare networks on measures that matter to customers, not abstract engineering parameters.

Opensignal performs the most representative analysis possible, ensuring the experience of the broadest user base is reflected equally. Our tests are carefully designed to reflect network experience accurately and to exclude sources of bias and error.

Experience insights are based on passive, real world usage

Designed to reflect typical end user experiences

Our tests are designed to represent the network experience a consumer will receive.

We do not seek to represent the “best-case” performance of the network or the experience of a subset of optimized users. We represent what the network is capable of under normal conditions.


Capture a range of active device types currently in use

We endeavor to analyze the widest range of devices possible, not just the latest devices used by a minority of users.

This mix of devices means our analytics output truly measures real-world experience: we analyze the full range of modems, chipsets, device brands and smartphone models in actual use, because the device itself is a key element of the measured experience.


Sampling covers a broad and widely represented base

The connectivity services and features that we offer to mobile publishers provide us with a diverse view of connected use cases, regions, and demographics from research participants. Our analytics ensure that we represent the widest user base possible.

Crowdsourced data is passively collected

Tests are random and automated

The majority of our analysis is derived from tests that are carried out automatically and at random without user intervention or initiation. This approach is endorsed by official bodies such as the FCC in the US.
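As an illustration only (not Opensignal’s actual client code), randomized automated sampling of this kind can be sketched in Python. The interval bounds and the `run_test` callback are hypothetical; the point is that tests fire at random times with no user initiation, which avoids biasing samples toward particular moments or user actions.

```python
import random


def schedule_random_tests(run_test, min_gap_s=1800, max_gap_s=7200, n_tests=3):
    """Run automated tests at random intervals, without user initiation.

    min_gap_s / max_gap_s are hypothetical bounds on the pause between
    consecutive tests. Randomizing the schedule prevents samples from
    clustering around specific times of day or user-triggered events.
    """
    results = []
    for _ in range(n_tests):
        wait = random.uniform(min_gap_s, max_gap_s)
        # A real background client would sleep/schedule here; we only
        # record the planned gap for illustration.
        results.append((wait, run_test()))
    return results
```

A production client would drive this from a background scheduler rather than a loop, but the randomized-gap structure is the same.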


Testing is conducted from device to real-world endpoint

Our active tests measure the true end-to-end experience by using endpoints that are representative of real user services.

We use multiple Content Delivery Networks (CDNs) and video content providers, and these are regularly reviewed to ensure they reflect typical user traffic. Operators do not deploy dedicated endpoint servers for us, and we design our tests so that operators cannot identify and optimize our test traffic to record artificially high performance.
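A device-to-endpoint measurement of this kind amounts to timing how many bytes arrive from a real service over the full network path. The sketch below is a minimal illustration, not Opensignal’s implementation: `fetch_chunk` stands in for a hypothetical reader pulling blocks from a CDN-hosted file.

```python
import time


def measure_throughput_mbps(fetch_chunk, duration_s=5.0):
    """Measure end-to-end download throughput in megabits per second.

    fetch_chunk() returns the next block of bytes from a real endpoint
    (e.g. a CDN-hosted file), or b"" when the transfer ends. Timing the
    whole device-to-endpoint path, rather than a dedicated operator
    server, captures the true end-to-end experience.
    """
    start = time.monotonic()
    received = 0
    while time.monotonic() - start < duration_s:
        data = fetch_chunk()
        if not data:
            break
        received += len(data)
    elapsed = max(time.monotonic() - start, 1e-9)
    return (received * 8) / (elapsed * 1_000_000)
```

Because the endpoint is an ordinary content server, the figure reflects what a user’s download or stream would actually achieve, not a best-case lab path.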

Speed and video tests measure actual experiences

The network speed tests behind our “Speed Experience” measures are designed to capture the speed a user will experience when using a speed-sensitive service such as streaming or a large file download.

Similarly, our video tests stream real content from providers selected to represent typical global traffic, and measure factors that directly impact users. We do not extrapolate from general speed measures, which ensures that traffic management applied only to video is accurately reflected.
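Measuring factors that directly impact viewers means working from playback events rather than raw speed. The sketch below summarizes a session into two illustrative metrics, load time and stall ratio; the event names and metric choices are hypothetical, not Opensignal’s scoring model.

```python
def video_experience_metrics(events):
    """Summarize a video playback session into user-facing metrics.

    events is a list of (timestamp_s, kind) tuples, where kind is one of
    'start_load', 'first_frame', 'stall_begin', 'stall_end', or 'end'.
    Both metrics reflect what a viewer actually perceives: how long the
    video took to start, and what fraction of playback was spent stalled.
    """
    times = {}
    stall_total = 0.0
    stall_begin = None
    for t, kind in events:
        if kind in ('start_load', 'first_frame', 'end'):
            times[kind] = t
        elif kind == 'stall_begin':
            stall_begin = t
        elif kind == 'stall_end' and stall_begin is not None:
            stall_total += t - stall_begin
            stall_begin = None
    load_time = times['first_frame'] - times['start_load']
    play_time = times['end'] - times['first_frame']
    return {
        'load_time_s': load_time,
        'stall_ratio': stall_total / play_time if play_time else 0.0,
    }
```

Deriving these values from a real stream is what lets video-only traffic management show up in the results, where a generic speed test would miss it.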