Our Testing Methods

How do we calculate carbon emissions?

To estimate the carbon emissions associated with loading web content, we utilise the co2.js library, developed by The Green Web Foundation.

Our calculation model:

We use the Sustainable Web Design model provided by co2.js for our calculations. This model takes into account the average energy mix used across the internet and the carbon intensity of electricity generation. It factors in the energy sources powering data centres and the efficiency of the networks transmitting data, providing a realistic estimate of the environmental impact of digital data transmission.
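To illustrate the shape of the calculation, here is a minimal sketch of a Sustainable Web Design-style estimate. The constants are rounded approximations drawn from the published model (roughly 0.81 kWh per GB transferred and a global average grid intensity of about 442 g CO2e per kWh); co2.js's internal values may differ, so treat this as an illustration rather than the library's exact arithmetic.

```javascript
// Illustrative sketch of a Sustainable Web Design-style estimate.
// Constants are rounded approximations from the published model,
// not necessarily the exact values co2.js uses internally.
const KWH_PER_GB = 0.81;           // approx. energy used per GB transferred
const GLOBAL_GRID_INTENSITY = 442; // approx. grams of CO2e per kWh, world average

function estimateEmissions(bytes) {
  const energyKwh = (bytes / 1e9) * KWH_PER_GB; // energy used end to end
  return energyKwh * GLOBAL_GRID_INTENSITY;     // grams of CO2e per page load
}

// A 2 MB page works out to roughly 0.7 g of CO2e per load.
console.log(estimateEmissions(2_000_000).toFixed(2));
```

In practice we rely on co2.js itself rather than hand-rolled constants, so that updates to the model flow through automatically.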

Key points in our calculation methodology:

  • Data transfer and energy use: The model estimates the energy used to transfer data over the internet. This includes not just the final mile to your device but also the energy consumed by the servers and intermediate networks.
  • Renewable energy consideration: Adjustments are made based on the proportion of renewable energy used in the data centres and networks that serve the content, reflecting the shift towards greener web hosting solutions.
  • Geographic specificity: The model adjusts carbon intensity based on the geographic location of data centres and the prevailing energy mix in those regions, providing a more localised assessment of carbon emissions.
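The renewable-energy adjustment above can be sketched as follows. The segment shares (data centre, network, device, hardware production) are the approximate splits published with the Sustainable Web Design model; the exact figures, and how co2.js applies green-hosting adjustments, may differ.

```javascript
// Sketch of a green-hosting adjustment in the spirit of the Sustainable
// Web Design model. Segment shares are the model's published
// approximations, not necessarily co2.js's exact internal figures.
const SEGMENTS = { dataCentre: 0.15, network: 0.14, device: 0.52, production: 0.19 };

function adjustForGreenHosting(gramsCo2, greenHosted) {
  if (!greenHosted) return gramsCo2;
  // Treat the data-centre share as carbon-free when hosting is renewable.
  return gramsCo2 * (1 - SEGMENTS.dataCentre);
}

console.log(adjustForGreenHosting(0.72, true).toFixed(3));
```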

Why the Sustainable Web Design model?

  • Accuracy: By accounting for variations in electricity generation in different regions and including renewable energy usage, this model provides a nuanced estimate of CO2 emissions.
  • Context-aware: Adjustments are made for the increasing use of renewable energy and improvements in technology in data centres and network infrastructure, reflecting more current environmental impacts.

The co2.js library and its Sustainable Web Design model help us guide website owners in understanding and potentially reducing their digital carbon footprint. This aligns with our mission to promote environmentally responsible web practices, contributing to a more sustainable internet ecosystem.

How we test website size & loading time

We use the WebPageTest API, a leading tool in web performance testing, to gather data on page size and loading speed. Here’s how we use it to capture those metrics and ensure the accuracy and reliability of our audits.

Choosing WebPageTest for accurate measurements

WebPageTest is renowned for its comprehensive and customisable testing environments. It allows us to simulate how a website performs across different geographic locations, browsers, and network conditions. By using WebPageTest, we can gather detailed insights into several performance metrics that are crucial for understanding the efficiency and impact of a website.

Our testing configuration:

  • Location: We conduct our tests from the London location, a major internet hub, so that our data reflects performance for users accessing sites from this region.
  • Browser: We use Google Chrome for all tests. Chrome’s widespread usage and its role as a benchmark for web development practices make it the ideal choice for our tests.
  • Number of tests: To ensure the reliability of our results, we run each test three times. This approach helps in averaging out any anomalies and provides a more consistent and accurate reading of a website’s performance.

Testing process:

Initiating the test: Using WebPageTest’s API, we initiate tests by sending a request that specifies the URL of the website to be tested, along with our testing parameters (location, browser, and the number of runs). This is done through a simple API call where we include our specific requirements in the URL request.
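A request of that shape can be sketched as below. The `runtest.php` endpoint and the `url`, `location`, `runs`, `f`, and `k` parameters are part of WebPageTest's HTTP API; the `London_EC2:Chrome` location ID and the API key are placeholders, so check the location IDs available to your account.

```javascript
// Sketch of building a WebPageTest run request via its HTTP API.
// "London_EC2:Chrome" and the API key are placeholders; consult the
// WebPageTest docs for the location IDs available to your account.
function buildTestRequest(targetUrl, apiKey) {
  const params = new URLSearchParams({
    url: targetUrl,
    location: "London_EC2:Chrome", // test agent location and browser
    runs: "3",                     // three runs, as described above
    f: "json",                     // ask for JSON-formatted results
    k: apiKey,                     // your API key
  });
  return `https://www.webpagetest.org/runtest.php?${params}`;
}

console.log(buildTestRequest("https://example.com", "YOUR_API_KEY"));
```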

Data collection: Once the test is complete, WebPageTest provides us with a JSON-formatted result that includes various performance metrics. Key metrics we focus on include:

  • Page size: The total size of the page measured in bytes, which is a critical factor in understanding the data footprint of a website.
  • Loading time: The total time it takes for the page to become fully interactive, which helps gauge the effectiveness of the site’s front-end architecture and hosting capabilities.
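Extracting those two metrics from the JSON result can be sketched as below. The `data.median.firstView` structure and the `bytesIn` and `fullyLoaded` fields follow WebPageTest's result format; which timing you treat as "loading time" (fully loaded versus time to interactive) is a choice, so verify the fields against a live response.

```javascript
// Sketch of pulling page size and loading time from a WebPageTest result.
// Field names follow the structure of WebPageTest's JSON output; verify
// them against a live response before relying on this.
function extractMetrics(result) {
  const view = result.data.median.firstView;
  return {
    pageSizeBytes: view.bytesIn,     // total bytes transferred
    loadingTimeMs: view.fullyLoaded, // time until the page is fully loaded, ms
  };
}

// Example with a trimmed-down result object:
const sample = {
  data: { median: { firstView: { bytesIn: 1_843_200, fullyLoaded: 3120 } } },
};
console.log(extractMetrics(sample));
```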

Analysis and reporting: After collecting the data, we analyse the results to identify opportunities for optimisation and to benchmark the website against industry standards. This analysis helps in providing actionable insights that can significantly improve a website’s performance and sustainability.

Benefits of multiple test runs:

Running each test three times mitigates variables that could skew a single run, such as network fluctuations or temporary dips in server performance. By averaging the results of the three runs, our data more accurately represents the website’s typical performance.
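The averaging step is straightforward; here is a minimal sketch, where the run objects and the `fullyLoaded` field name are illustrative:

```javascript
// Sketch of averaging one metric across the three test runs to smooth
// out one-off network or server fluctuations.
function averageMetric(runs, metric) {
  const total = runs.reduce((sum, run) => sum + run[metric], 0);
  return total / runs.length;
}

// Illustrative run data (loading times in milliseconds):
const runs = [{ fullyLoaded: 2900 }, { fullyLoaded: 3400 }, { fullyLoaded: 3000 }];
console.log(averageMetric(runs, "fullyLoaded")); // → 3100
```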


All the data we provide are estimates, carefully calculated to reflect real-world conditions as accurately as possible. We understand the importance of precision in our assessments and continuously refine our methodologies to ensure that our estimates align closely with actual performance and environmental impact. By integrating the latest research and technology, and by adhering to industry best practices, we strive to deliver data that you can trust for making informed decisions about your digital presence.