Measuring Carbon Data Efficiently and Effectively

There is widespread recognition that the path to Net Zero starts with baselining greenhouse gas emissions, supported by a clear understanding of the estate and its constituent assets; after all, you can’t manage what you don’t measure.

The problem, however, is that measuring both embodied and operational carbon is not straightforward, and the view now forming is that the UK government is failing to lead the way on baselining carbon and cutting emissions. The Public Accounts Committee recently said, “across Whitehall departments the rules for reporting, gathering data and taking action are vague, there is a free-for-all on reporting emissions reductions or the lack of them, and oversight is fragmented and ineffective.” Committee Chair Meg Hillier said: “Vague guidance and lack of follow-up make it hard for the public to hold the government to account. A free-for-all on reporting veils progress or lack of it. Government needs to be clearer and must publish consistent standards for measuring and reporting emissions across the public sector so that it can be properly held to account.”

700+ projects with a capital value of approximately £50bn and a lifetime value of £290bn

How could this be addressed?

In our view at CURSHAW, the following are fundamental:

  1. Clearer guidance - The Net Zero Playbook is an invaluable resource, but it needs to be simplified to enable more systematic reporting of higher-integrity data. That would allow the BEIS-reported public sector emissions from buildings to be broken down by Whitehall department for benchmarking purposes. For now, those responsible for baselining carbon have no option but to navigate a series of PDF guides, with all roads leading to the 2009 DEFRA guidance on how to measure and report greenhouse gas emissions. Should an updated version be produced, recognising the developments in technology, standards and measurement techniques that have occurred during the intervening 13 years?

  2. Alignment with industry-recognised standards - Without a consistent approach to data collection it is not possible to make meaningful comparisons across portfolios. Key to achieving consistency is collecting data by reference to an industry-recognised standard. At CURSHAW we have seen first-hand that organisations are baselining carbon emissions using mechanisms that are not linked to industry-recognised standards, such as those from UKGBC or the Science Based Targets initiative, making it difficult to audit data collection exercises and the subsequent pathways to net zero. BBP, BRE, the Carbon Trust, CIBSE, IStructE, LETI, RIBA, RICS and UKGBC have now joined forces to establish the UK's first Net Zero Carbon Buildings Standard. It promises to “set out metrics by which net zero carbon performance is evaluated, as well as performance targets, or limits, that need to be met. These are likely to include energy use, upfront embodied carbon, and lifecycle embodied carbon, with other metrics – such as space heating/cooling demand and peak load – also to be considered.” To address the criticism of the Public Accounts Committee, we consider the adoption of standards, or better yet a single standard, for the public sector estate to be essential. The obvious leading contender is the forthcoming UK Net Zero Carbon Buildings Standard.

  3. Workflow linked to a standard and the Government conversion factors for calculating greenhouse gas emissions - Would it be a step too far to engineer workflow, i.e. an orchestrated and repeatable pattern of activity in a system solution, to support people through the process of baselining carbon emissions, making the process itself more efficient and the data more accurate? At the very least, the same or similar question sets should be adopted and applied across all portfolios being benchmarked. The question set should be automatically linked to the BEIS/DEFRA conversion factors to remove the potential for error in manual calculations. Automation is also key to ensuring that the correct year’s conversion factors are applied, given that both the conversion factors and the data to which they are applied change from year to year (see the first sketch after this list).

  4. Data governance is essential - People like a visual representation of data, and attach excess value to dashboards and data objects that can be playfully manipulated, but sadly care less about data governance and the control points that are essential to collecting high-integrity data. The risk is a rush to collect data for the sake of creating a pretty picture, only to make inferences based on flawed inputs. Those responsible for stewardship of baselining greenhouse gas emissions therefore need to concern themselves with who provides the data, where it comes from, whether evidence is required to corroborate entered values, and how the data is collected; building occupiers, and indeed Facilities Management field teams, often do not have access to the data and may need to liaise with subcontractors or other third parties. It is also essential that data is collected in such a way that there is a single return per property, from a known and named person, with no possible confusion over versions. It is equally important to consider how error can be managed out - the most efficient way being to use systems that validate data entry at source (see the validation sketch after this list). Finally, those leading data collection exercises need to think about the aggregation of data within and across portfolios. Where possible this should also be done in a systems environment to avoid manual handling of data. Using a single workflow or question set with a common data taxonomy is also key.

  5. Linking property and project emissions to base asset data - Where possible, automating meter readings and linking meters to sub-meters and the related assets will provide invaluable insight into the individual assets that drive operational emissions. That insight enables asset management strategies to be developed that accurately target and achieve net zero (a sketch of such a meter-to-asset linkage appears after this list). We have yet to see any organisation that is both measuring its greenhouse gas emissions and linking those emissions to underlying asset data.

  6. Avoiding spreadsheets - Using 20th-century tools for a 21st-century problem is wrong. We can only speculate as to why spreadsheets are currently in use - could it be that we are all only at the beginning of the journey to net zero, or could it be that those embarking on data collection are unwilling to fund more sophisticated collection methods? What’s clear to us is that the approach taken by a number of organisations is not sustainable. Spreadsheet-based approaches are, prima facie, cheaper precisely because they are not software systems (which involve complex architecture and infrastructure). But spreadsheet-based approaches neither involve nor enable:

    A) Automation, i.e. automatic issuance of workflow/question sets to predetermined respondents, automatic routing of responses to predetermined approvers, and automatic population of data lakes and, ultimately, of presentation-layer data objects (Microsoft Power BI/Tableau)

    B) Validation at source, i.e. there is nothing to stop error at the point of entry

    C) Control at the point of entry. With a spreadsheet it is much harder to restrict who enters data.

    D) Version control. There will be multiple versions of the same response.

    E) Consolidation and data aggregation across properties and projects within portfolios. This is a manual exercise.

    F) Secure transfer to central government oversight bodies. This is a manual process requiring information to be sent either over email or via government secure servers.

    G) Flexibility or scalability. There is no ability to add or adapt workflows and questions, or to add more data attributes.

    H) Consistency - a fixed question set for all projects that cannot be manipulated by respondents. Benchmarking is only possible if the same data is collected across properties and projects within portfolios.

    I) The ability to feed the data into other tools - cost modelling, BIM tools for remediation, or asset management plans.

    Although initially cheaper, using spreadsheets as a methodology for data collection across multiple projects is inefficient and ineffective. Once the full-life costs (including opportunity costs) of manual validation, version control, error rectification, handling, consolidation and analysis are added, the spreadsheet approach is likely to produce significantly poorer results at a significantly higher cost than the initial face-value figures would suggest.
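To make point 3 concrete, here is a minimal sketch of a year-aware conversion factor lookup. The factor values, activity names and the ActivityRecord structure are assumptions for illustration only; a real system would load the official BEIS/DEFRA dataset for each reporting year rather than hard-coding values.

```python
# Minimal sketch: bind each activity record to the conversion factor for its
# own reporting year. All factor values are illustrative placeholders, not
# the published BEIS/DEFRA figures.
from dataclasses import dataclass

# Hypothetical factor table keyed by (reporting_year, activity), kgCO2e per unit.
CONVERSION_FACTORS = {
    (2021, "electricity_kwh"): 0.212,  # placeholder value
    (2022, "electricity_kwh"): 0.193,  # placeholder value
    (2021, "natural_gas_kwh"): 0.183,  # placeholder value
    (2022, "natural_gas_kwh"): 0.182,  # placeholder value
}

@dataclass
class ActivityRecord:
    property_id: str
    reporting_year: int
    activity: str    # e.g. "electricity_kwh"
    quantity: float  # consumption in the activity's unit

def emissions_kgco2e(record: ActivityRecord) -> float:
    """Apply the factor for the record's own reporting year; refuse to guess."""
    key = (record.reporting_year, record.activity)
    if key not in CONVERSION_FACTORS:
        # Failing loudly beats silently reusing another year's factor,
        # which is a classic manual-calculation error.
        raise KeyError(f"no conversion factor loaded for {key}")
    return record.quantity * CONVERSION_FACTORS[key]

print(emissions_kgco2e(ActivityRecord("PROP-001", 2022, "electricity_kwh", 120_000.0)))
```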
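Point 4’s validation at source could look like the following sketch, in which each property accepts exactly one return, from a named respondent, and implausible values are rejected at the point of entry. The field names, plausibility limit and in-memory store are assumptions for illustration.

```python
# Minimal sketch of validation at source and singular returns per property.
returns: dict[str, dict] = {}  # one validated return per property_id

def submit_return(property_id: str, respondent: str, electricity_kwh: float) -> None:
    """Reject bad entries when they are submitted, not after aggregation."""
    if not respondent.strip():
        raise ValueError("a named respondent is required for auditability")
    if property_id in returns:
        raise ValueError(f"a return already exists for {property_id}; "
                         "amendments should go through an approver, not a new version")
    if not 0 <= electricity_kwh <= 50_000_000:  # assumed plausibility limit
        raise ValueError("electricity consumption is outside the plausible range")
    returns[property_id] = {"respondent": respondent, "electricity_kwh": electricity_kwh}

submit_return("PROP-001", "J. Smith", 120_000.0)    # accepted
# submit_return("PROP-001", "A. Jones", 115_000.0)  # raises: duplicate return
```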
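The meter-to-asset linkage in point 5 might be modelled as below: sub-meters roll up to a parent meter, and each sub-meter is tied to a named asset so that consumption, and therefore operational emissions, can be attributed to the assets that drive them. The asset names and readings are invented examples.

```python
# Minimal sketch of linking meters, sub-meters and the assets they monitor.
from dataclasses import dataclass, field

@dataclass
class SubMeter:
    asset: str  # the asset this sub-meter monitors, e.g. "Chiller 1"
    readings_kwh: list[float] = field(default_factory=list)

    def total_kwh(self) -> float:
        return sum(self.readings_kwh)

@dataclass
class Meter:
    property_id: str
    sub_meters: list[SubMeter] = field(default_factory=list)

    def consumption_by_asset(self) -> dict[str, float]:
        # Attribute consumption to individual assets for targeting purposes.
        return {sm.asset: sm.total_kwh() for sm in self.sub_meters}

meter = Meter("PROP-001", [
    SubMeter("Chiller 1", [1_200.0, 1_150.0]),
    SubMeter("AHU 2", [640.0, 655.0]),
])
print(meter.consumption_by_asset())  # {'Chiller 1': 2350.0, 'AHU 2': 1295.0}
```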
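Finally, the consolidation that point E identifies as a manual exercise in spreadsheets becomes a trivial query in a systems environment. A sketch, with invented portfolio assignments and figures:

```python
# Minimal sketch: roll validated single returns per property up to portfolio
# level with no manual copy-and-paste or re-keying.
from collections import defaultdict

validated_returns = [
    {"portfolio": "Defence", "property_id": "PROP-001", "emissions_kgco2e": 84_500.0},
    {"portfolio": "Defence", "property_id": "PROP-002", "emissions_kgco2e": 61_200.0},
    {"portfolio": "Health",  "property_id": "PROP-101", "emissions_kgco2e": 132_900.0},
]

def aggregate_by_portfolio(rows: list[dict]) -> dict[str, float]:
    totals: dict[str, float] = defaultdict(float)
    for row in rows:
        totals[row["portfolio"]] += row["emissions_kgco2e"]
    return dict(totals)

print(aggregate_by_portfolio(validated_returns))
# {'Defence': 145700.0, 'Health': 132900.0}
```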
