Synthesis Article

Assessing state compliance with multilateral climate transparency requirements: ‘Transparency Adherence Indices’ and their research and policy implications

Pages 635-651 | Received 04 Jul 2020, Accepted 19 Feb 2021, Published online: 02 Apr 2021

ABSTRACT

Transparency is increasingly central to multilateral climate governance. In this article, we undertake one of the first systematic assessments of the nature and extent of compliance with transparency requirements under the United Nations Framework Convention on Climate Change (UNFCCC). Extensive resources are now being devoted to setting up national and international transparency systems that aim to render visible what individual countries are doing with regard to climate change. It is widely assumed that such transparency is vital to securing accountability, trust and thereby also enhanced climate actions from all. Yet, whether transparency lives up to this transformative promise remains largely unexamined. We generate a first systematic overview here of the nature and extent of country engagement with and adherence to UNFCCC transparency requirements. Drawing on extensive primary documents, including national reports and technical expert assessments of these reports, we generate ‘Transparency Adherence Indices’ for developed and developing country Parties to the UNFCCC. Our results reveal wide variations in adherence to mandatory reporting requirements and no clear general pattern of improvement since 2014. Our Indices help to illustrate trends and highlight knowledge gaps around the observed adherence patterns. This is timely since the 2015 Paris Agreement calls for an ‘enhanced transparency framework’ to be implemented by 2024 that builds on existing UNFCCC transparency systems. We conclude with identifying a research and policy agenda to help explain observed patterns of adherence, and emphasize the need for continued scrutiny of assumed links between transparency and climate action.

Key policy insights

  • The UNFCCC and its 2015 Paris Agreement call for ever greater climate transparency from all countries.

  • We develop ‘Transparency Adherence Indices’ that reveal frequency of engagement and adherence to reporting requirements of both developed and developing countries.

  • Our findings reveal high engagement in transparency arrangements by developed countries and variable engagement from developing countries.

  • We question what variable adherence to reporting requirements actually signifies, given how such adherence is assessed within UNFCCC technical expert reports.

  • Further research to explain the range of observed adherence patterns is important, in light of the Paris Agreement’s requirements for enhanced transparency from all.

  • The assumed link between enhanced transparency and climate action needs further analysis.

1. Introduction

The Paris Agreement was adopted to great acclaim by governments in 2015. It requires all states that have ratified it – Parties to the agreement – to specify voluntary climate commitments, in line with their national circumstances and the level of ambition they see fit (UNFCCC, Citation2015). One of the few legally binding obligations in the Paris Agreement is for Parties to be transparent to each other about their progress in meeting their voluntary commitments. Transparency is thus increasingly central to multilateral climate governance. In recent years, extensive financial and human resources have been devoted to setting up national and international transparency systems to render visible what individual countries are doing. The assumption is that such transparency is vital to keeping countries informed about each other’s climate intentions and actions, thereby enhancing accountability, mutual trust and ultimately, more ambitious climate actions from all (e.g. CEW, Citation2018). Yet, whether transparency lives up to this transformative promise remains an open question (Fox, Citation2007; Gupta, Citation2010; Gupta & van Asselt, Citation2019; Karlsson-Vinkhuyzen et al., Citation2018).

Current transparency arrangements under the United Nations Framework Convention on Climate Change (UNFCCC) consist of reporting and review cycles that link national and international levels. Countries are called upon to regularly report on, inter alia, their greenhouse gas (GHG) emissions and the actions taken to reduce those emissions and to adapt to the impacts of climate change. Developed countries are also required to report on international support – financial and technical – provided to developing countries to help the latter implement the Convention.

Yet, what are these elaborate transparency arrangements delivering? There is mainly anecdotal evidence and grey literature examining this so far (e.g. Briner & Moarif, Citation2016; Dagnet et al., Citation2017; Elliott et al., Citation2017; Niederberger & Kimble, Citation2011). Even the nature and extent of engagement with UNFCCC transparency processes remains little examined (exceptions include Briner & Moarif, Citation2016; Ellis & Moarif, Citation2015; Pulles, Citation2017). There remains an implicit or explicit assumption that UNFCCC transparency will help to stimulate enhanced climate action and increased ambition (see e.g. Deprez et al., Citation2015; Falkner, Citation2016; Jacquet & Jamieson, Citation2016; Streck et al., Citation2016; for a cautionary tale, see e.g. Weikmans et al., Citation2020). However, it is difficult to meaningfully assess these oft-assumed links between transparency, accountability, trust and ambition, without knowing if countries are actually being transparent – i.e. whether they are participating in UNFCCC reporting cycles and adhering to reporting requirements.

In this article, we undertake one of the first systematic assessments of country engagement with current UNFCCC transparency arrangements. Our focus is on the most recent processes, agreed in 2010 and being implemented since 2014. These transparency processes require countries to generate Biennial (Update) Reports, which are then subject to an internationally-coordinated process of technical expert review and/or analysis. In assessing engagement with these transparency processes, we identify patterns of participation, including frequency of reporting and adherence to reporting requirements. We capture these patterns in ‘Transparency Adherence Indices’ that we develop. These Indices provide an important starting point for continued in-depth analyses of what UNFCCC transparency arrangements are helping to shed light on, and the potential consequences for enhanced accountability, trust and increased ambition in multilateral climate governance.

Such an analysis is particularly timely, given that the 2015 Paris Agreement calls for an ‘enhanced transparency framework’ to be implemented in 2024, which is intended to build on lessons learned from implementation of current UNFCCC transparency arrangements. A key difference between current requirements and the Paris Agreement’s enhanced transparency framework is that the latter will be applicable to all PartiesFootnote1, while current arrangements specify different (and less stringent) transparency requirements for developing versus developed country Parties (Weikmans et al., Citation2020; Winkler et al., Citation2017). Yet, adherence by Parties even to these pre-Paris differentiated transparency requirements remains both uneven and unevenly understood. Thus, it is important and timely to consider trends and patterns in adherence to transparency requirements by both developed and developing country Parties.

We proceed as follows: Section 2 briefly describes current UNFCCC transparency arrangements, and the differential reporting obligations that apply to developed versus developing country Parties. Section 3 details our data sources and data collection strategy, and methodology to generate our Transparency Adherence Indices. We then present these indices (Section 4), and a related research and policy agenda (Section 5). We conclude in Section 6.

2. Transparency requirements under the UNFCCC

Transparency has long been a vital element of the global climate convention. Under the UNFCCC, each Party has to regularly communicate how it is implementing the Convention (UNFCCC, Citation1992, article 12). There are specific requirements regarding what information is to be reported, how and when. These reports are subject to some form of international review and/or analysis. These transparency requirements have evolved since adoption of the Convention in 1992, with the Paris Agreement’s enhanced transparency framework being the latest iteration (for an historical overview, see Weikmans et al., Citation2020; Winkler et al., Citation2017).

As summarized in Table 1 below, current transparency arrangements under the UNFCCC are differentiated for developed country (Annex I) Parties and developing country (non-Annex I) Parties.

Under the UNFCCC, the nomenclature ‘Annex I Parties’ refers to industrialized countries that were members of the Organization for Economic Co-operation and Development (OECD) in 1992 when the Convention was agreed, as well as countries with economies in transition. The categorization ‘non-Annex I Parties’ is used in the UNFCCC context to refer to developing countries/non-OECD countries. Developed country (Annex I) Parties are further sub-divided into two groups: so-called ‘Annex II’ Parties (broadly, industrialized countries) and ‘other Annex I’ Parties (broadly, countries with economies in transition). The main distinction in transparency requirements between these two sub-categories of developed country Parties is that Annex II countries have the additional obligation to report on the support that they provide to developing countries, an obligation that countries with economies in transition (other Annex I) do not have. For ease of reading, for the remainder of this article we use the terms developed and developing country Parties to refer to Annex I and non-Annex I countries respectively, except where we need to distinguish between the two sub-categories of developed countries (in which case, we use the terminology of Annex II and other Annex I).

Table 1. Current differentiated reporting and review processes under the UNFCCC.

Differentiated transparency requirements between developed and developing country Parties in the UNFCCC have been the subject of much contestation over the years, reflecting the reality that transparency is not seen merely as a technical matter involving neutral reporting but is a site of political conflict and negotiation (Gupta & Mason, Citation2016; Gupta & van Asselt, Citation2019). In particular, developing countries have resisted calls to increase their reporting obligations, and have their climate actions subject to international oversight. With adoption of the 2015 Paris Agreement, however, an ‘enhanced transparency framework’ now applies to all Parties (see e.g. Weikmans et al., Citation2020; Winkler et al., Citation2017).

Below, we describe further the current differentiated transparency requirements for developed versus developing country Parties (focusing first on reporting, and then on review), before assessing compliance with them in subsequent sections.

2.1. Reporting

One of the first transparency provisions in the UNFCCC is the requirement that all Parties submit regular national reports, called National Communications (UNFCCC, Citation1992, article 12). Such reports are to contain information on, among other things, national circumstances within which climate actions are contemplated, GHG emissions, and mitigation and adaptation measures (UNFCCC, Citation1999). The National Communications of Annex II Parties (the sub-category of developed country Parties with the obligation to provide support) have to include information on support provided to developing country Parties.

National Communications of developing country Parties are to include information on constraints and gaps, and related financial, technical and capacity needs, although these reports are subject to less detailed guidelines than those from developed country Parties.

In addition to National Communications, all Parties to the UNFCCC are required to submit regular GHG inventories, with developed country Parties required to do so annually (UNFCCC, Citation2013). Developing country Parties are not required to submit separate national inventory reports but need to include results of their GHG inventories in their National Communications.

Important changes to these longstanding transparency arrangements were adopted in 2010 and came into force in 2014. These entailed additional reporting and international review/analysis processes for both developed and developing country Parties. These changes followed the new direction towards voluntary climate action pledges from all, enshrined by the 2009 Copenhagen Accord and the 2010 Cancún Agreements (UNFCCC, Citation2009, Citation2010). As per these latest arrangements, developed country Parties have to submit an additional report, a Biennial Report (BR) starting in 2014 (UNFCCC, Citation2011). These Biennial Reports are to contain information on progress in achieving emission reductions, mitigation actions and emissions reductions achieved, projected emissions, and the provision of financial, technical and capacity building support. For developing country Parties, there is now a mandatory requirement to submit a Biennial Update Report (BUR), with an encouragement to submit the first one by 2014. Least Developed Countries (LDCs) and Small Island Developing States (SIDS) can submit their BURs at their discretion. The BURs of developing country Parties should include information on, among other things, national circumstances and institutional arrangements, GHG emissions, mitigation actions, and financial, technical and capacity needs (UNFCCC, Citation2011).

Thus, these new arrangements, being implemented since 2014, signal an important ramping up of transparency requirements from developing country Parties, compared to the earlier obligations relating to National Communications and GHG inventories.

2.2. Review

The National Communications submitted by developed country Parties are subject to regular international reviews (for the current guidelines, see UNFCCC, Citation2014). These reviews are organized by the UNFCCC Secretariat and are carried out by ‘expert review teams’, consisting of experts nominated by Parties and experts from intergovernmental organizations. Expert review teams examine the information provided and assess progress made. While the experts are more often than not government officials, the review process is intended to be ‘non-political’ and experts are to serve in their personal capacity. The reviews can be (i) desk-based, with experts reviewing the information individually; (ii) centralized, with experts meeting up to review the information; and (iii) in-country, with experts visiting the country under review. The review reports are made public, though the process allows Parties to respond to the reports before their release. Since 2003, each national GHG inventory submission has also been subject to a technical expert review. Like the in-depth reviews of National Communications, these include desk-based reviews, centralized reviews and in-country visits (the latter at least once every five years). Review reports are also made publicly available. National Communications submitted by developing country Parties are not subject to review.

These differentiated review processes have also changed in the latest iteration of UNFCCC transparency requirements, adopted in 2010 and being implemented since 2014. The newly required BRs from developed country Parties are subject to a process of ‘international assessment and review’. This combines a technical expert review with a state-to-state peer questioning process called ‘multilateral assessment’ (see UNFCCC, Citation2011; Citation2014). The technical expert review of the BRs resembles the review of National Communications and GHG inventories described above. Experts can ask questions and request information from the Party being reviewed, and offer suggestions and advice. The face-to-face multilateral assessment, organized under the auspices of the UNFCCC’s Subsidiary Body for Implementation, entails Parties asking each other questions about the information contained in their reports. Parties can also submit written questions in advance of these sessions. The Multilateral Assessment is held in parallel with on-going political deliberations during annual meetings of the UNFCCC Conference of the Parties, and is recorded. The Secretariat maintains a record of the questions and answers and a summary report is made available on the UNFCCC website.

The newly required BURs from developing country Parties are subject to a slightly distinct process termed ‘international consultation and analysis’ (UNFCCC, Citation2011). This mirrors the two-step process of the international assessment and review described above for developed country Parties, although the terminology varies. Thus, the BURs are analysed (rather than reviewed) by a team of technical experts, in consultation with the concerned Party, and Parties then participate in a ‘facilitative exchange of views’ (rather than a multilateral assessment) although the procedural elements are the same.Footnote2 These differences in terminology signal persisting political contestation around whether or not transparency requirements should be distinct for developed and developing country Parties (Gupta et al., Citation2021).

3. Methodology

In assessing participation in reporting cycles (understood here as submitting BRs and BURs), we have relied on information available on the UNFCCC website.Footnote3 This information details which countries have submitted how many of these reports and when. For the second aspect, adherence to reporting requirements, we use as our primary data source the summary assessments of adherence contained in the technical reviews of BRs and technical analyses of BURs undertaken under the UNFCCC. These expert reports are available on the UNFCCC website.Footnote4 We use these technical reviews/analyses of BRs/BURs as our primary data source because assessing adherence with UNFCCC reporting requirements is an enormous task requiring considerable resources. By one recent estimate (Pulles, Citation2017), the technical review of developed country Parties’ annual GHG emission inventories alone required about 4000 working days per year. We were not able to find any comprehensive figures on the total time devoted to conducting the technical review of developed country Parties’ National Communications and BRs and the technical analysis of developing country Parties’ BURs. Nonetheless, the resources spent on conducting these technical reviews and analyses are significant. Yet, these existing assessments of adherence (as contained in these expert reports, described further below) have not been scrutinized, nor compared in scholarly research. They nonetheless contain important information that can help to reveal patterns and trends which, if analysed and explained, can shed light on what transparency can deliver in climate governance. Hence, we see much value-added in relying on this available primary data source for our analysis.

We describe further below these data sources and our data collection strategy, and the methodology we use to construct the Transparency Adherence Indices. This methodology is different for reports prepared by developed country and developing country Parties (BRs and BURs respectively), given different reporting requirements and UNFCCC criteria for assessing adherence.

3.1. Generating the developed country (BR) Transparency Adherence Index

3.1.1. Primary data sources

As indicated above, we use the technical expert reviews of BRs, conducted since 2014 under the UNFCCC, as our primary source of data. Specifically, we draw on the assessment of adherence to reporting requirements contained in these technical expert reviews. We describe here therefore how these assessments are undertaken and presented in the expert reports.

First, when undertaking technical reviews of BRs, the expert review teams distinguish between mandatory reporting requirements (the ‘shall’ requirements) and other requirements (the ‘should’ and ‘may’ requirements) that developed country Parties have to comply with. For the mandatory ‘shall’ requirements, the expert teams formulate and include in their assessment report a ‘recommendation’ about how to overcome any assessed shortfalls. For non-mandatory requirements, this consists of ‘encouragements’ (rather than recommendations).

Second, each technical review report contains an assessment of the completeness and transparency of the information reported in each of the four main sections of BRs: (i) emissions and removals related to the Party’s quantified economy-wide emission reduction target; (ii) assumptions, conditions and methodologies related to the attainment of the Party’s quantified economy-wide emission reduction target; (iii) progress the Party has made towards the achievement of its quantified economy-wide emission reduction target; and (iv) the Party’s provision of financial, technological and capacity-building support to developing country Parties (UNFCCC, Citation2014; UNFCCC Secretariat, Citation2019).

The assessment by the expert review teams of each section’s completeness and transparency is based on four gradations. For completeness, these include: complete; mostly complete; partially complete; not complete. For transparency, these are: transparent; mostly transparent; partially transparent; not transparent. Thus, expert review teams treat completeness independently from transparency in their assessment of adherence to reporting guidelines, i.e. they provide separate recommendations/encouragements for completeness versus transparency, in relation to each reporting requirement. If the information reported by a given Party in its BR does not fully correspond to the particular reporting requirement of the mandatory guidelines, then the expert review team formulates a recommendation on completeness in its review report. If the information reported by the Party gives rise to questions and does not allow the reader to assess its credibility, reliability and relevance, then the technical experts formulate a recommendation on transparency (UNFCCC Secretariat, Citation2019).

In the expert review teams’ reports, the grade attributed to each of the four sections of each BR depends on the number of recommendations formulated by the expert review team relating to the requirements of that particular section (see Appendix Table A1 for an overview). A mandatory requirement can only trigger one recommendation on completeness and/or one recommendation on transparency, even if the mandatory requirement contains more than one specific reporting parameter.Footnote5 All mandatory reporting requirements are supposed to be treated equally by expert review teams and an ‘“expert’s weighting factor” should not be applied, as that could imply that some “shall” requirements are more important than others’ (UNFCCC Secretariat, Citation2019, p. 8). Finally, an expert review team also applies ‘a qualitative assessment in its expert judgement in order to make a final determination on the level of completeness and transparency’ (UNFCCC Secretariat, Citation2019, p. 9).

3.1.2. Data collection and analysis

To generate our database, we compiled completeness and transparency grades for BRs of each developed country Party, with a breakdown according to the four main sections of BRs (see supplemental material 1). We synthesized this information from the 123 technical review reports available on the UNFCCC website for all BRs submitted since 2014, through February 18th, 2020. These cover the first, second and third BRs reviewed by expert review teams. Our database also contains information on the timeliness of the submission of each BR.

3.1.3. Construction of the BR ‘Transparency Adherence Index’

As a next step, we drew on our compiled database to construct a BR ‘Transparency Adherence Index’. We developed this by generating an equally weighted average of four indices, each representing adherence to the mandatory requirements of one of the four sections of BRs. In each of these, we accorded points for completeness (40%), transparency (40%) and timeliness (20%), according to the scorecard (which we developed) presented in Appendix Table A2. Our scorecard uses a scale of 0 (lowest grade) to 3 (highest grade) to grade the level of completeness, transparency and timeliness.

For example, if a particular section of a BR submitted before the deadline (timeliness score: 3) was assessed by the expert review team as ‘mostly complete’ (completeness score: 2) and ‘partially transparent’ (transparency score: 1), then the adherence value of this particular section of the report is: (3/3*0.2) + (2/3*0.4) + (1/3*0.4) = 0.6 or 60%.
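The same calculation can be expressed compactly in code. The following minimal Python sketch is illustrative only and is not the authors’ implementation: the 0 to 3 grade encodings and the 40/40/20 weights follow the description above, while the function and variable names are assumptions introduced here for readability.

```python
# Illustrative sketch (not the authors' code) of the BR section adherence
# value and the BR Transparency Adherence Index described in Section 3.1.3.

COMPLETENESS = {"complete": 3, "mostly complete": 2,
                "partially complete": 1, "not complete": 0}
TRANSPARENCY = {"transparent": 3, "mostly transparent": 2,
                "partially transparent": 1, "not transparent": 0}


def br_section_score(completeness, transparency, timeliness):
    """Adherence value of one BR section: completeness (40%), transparency
    (40%) and timeliness (20%), each graded on a 0-3 scale."""
    return (COMPLETENESS[completeness] / 3 * 0.4
            + TRANSPARENCY[transparency] / 3 * 0.4
            + timeliness / 3 * 0.2)


def br_adherence_index(section_scores):
    """BR Transparency Adherence Index: equally weighted average of the
    adherence values of the four BR sections."""
    return sum(section_scores) / len(section_scores)


# Worked example from the text: a section submitted on time (timeliness 3),
# assessed as 'mostly complete' (2) and 'partially transparent' (1).
print(round(br_section_score("mostly complete", "partially transparent", 3), 2))  # 0.6
```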

3.2. Generating the developing country (BUR) Transparency Adherence Index

3.2.1. Data source

As with the BRs, our data sources to assess adherence by developing country Parties to reporting requirements were the technical expert analyses of individual BURs, undertaken under the auspices of the UNFCCC. These technical analyses identified the extent to which information to be reported, as per BUR guidelines (UNFCCC, Citation2011), was included in each report. This information included: (i) the national GHG inventory report; (ii) information on mitigation actions, including a description of such actions, an analysis of their impacts and the associated methodologies and assumptions, as well as the progress made in their implementation and information on domestic measurement, reporting and verification; and (iii) information on support needed and received. It is important to mention that most of these requirements are not mandatory. Mandatory requirements only concern certain provisions related to (i) the national GHG inventory report;Footnote6 and (ii) information on mitigation actions.Footnote7

For each BUR, the technical team of experts identified the extent to which information for each reporting category was included in the report, by using one of the three following comments: ‘yes’; ‘partly’; and ‘no’.

3.2.2. Data collection and analysis

Drawing on the data source above, we generated a database (see supplemental material 2) that compiled the grades attributed to each of the mandatory reporting requirements in BURs. We drew on a total of 72 technical summary reports available on the UNFCCC website, covering BURs submitted between 2014 and February 18th, 2020. The database also contains information on the timeliness of the submission of each BUR.

3.2.3. Construction of the BUR ‘Transparency Adherence Index’

Our resultant BUR Adherence Index is an equally weighted average of two indices; each of these captures the extent to which mandatory information was included in each of the two BUR sections: (i) the national GHG inventory report and (ii) information on mitigation actions.

We made the decision to include only the (limited) mandatory reporting requirements for developing country Parties in our database in order to ensure fairness and accuracy in assessing adherence to reporting requirements across developed and developing country Parties, since we also assessed only mandatory requirements for developed country Parties. However, this meant in practice that certain key categories of information that are particularly relevant for developing country Parties, such as international support needed and received or adaptation actions, are not captured in our BUR Transparency Adherence Index, since reporting on these elements is voluntary. We return to this point in our discussion.

For the first BUR of all developing country Parties and for all BURs of those with LDC and/or SIDS status, our indices only reflect the extent to which the mandatory elements of information are included (rather than also timeliness). This is because the start date of 2014 for the first BUR from all developing country Parties is encouraged rather than mandatory; and LDCs and SIDS can submit all BURs at their discretion. For the second and third BURs of developing country Parties that do not have LDC and/or SIDS status, our indices reflect the extent to which the mandatory elements of information are included (80%) and timeliness (20%), according to the scorecard (which we developed) presented in Appendix Table A3.
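A comparable sketch for the BUR index is given below. Again, this is illustrative rather than the authors’ implementation: the ‘yes’/‘partly’/‘no’ coding follows the technical expert summary reports, but the numeric mapping of those comments to 1, 0.5 and 0 is an assumption made here for illustration, since the actual scorecard is the one presented in Appendix Table A3.

```python
# Illustrative sketch (not the authors' code) of the BUR Transparency
# Adherence Index described in Section 3.2.3. Mapping the experts'
# 'yes'/'partly'/'no' comments to 1/0.5/0 is an assumption for illustration.

GRADE = {"yes": 1.0, "partly": 0.5, "no": 0.0}


def bur_section_score(comments):
    """Average extent to which the mandatory items of one BUR section
    (national GHG inventory or mitigation actions) were included."""
    return sum(GRADE[c] for c in comments) / len(comments)


def bur_adherence_index(inventory, mitigation, timeliness=None):
    """Equally weighted average of the two mandatory BUR sections. A
    timeliness score (0-1 here) is folded in at 20% weight only for the
    second and third BURs of Parties without LDC/SIDS status."""
    content = (bur_section_score(inventory) + bur_section_score(mitigation)) / 2
    if timeliness is None:  # first BURs, and all BURs of LDC/SIDS Parties
        return content
    return 0.8 * content + 0.2 * timeliness


# Hypothetical example: all inventory items reported ('yes'), mitigation
# items only partly reported, submission on time.
print(round(bur_adherence_index(["yes", "yes", "yes"],
                                ["partly", "partly"], timeliness=1.0), 2))  # 0.8
```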

4. Complying with UNFCCC reporting requirements: ‘Transparency Adherence Indices’

In this section, we describe our findings on country compliance with UNFCCC transparency requirements. We do so through presenting our BR and BUR Transparency Adherence Indices, which synthesize the information compiled in our databases on frequency of country participation in reporting cycles and adherence to mandatory reporting requirements.

4.1. Developed country compliance with UNFCCC transparency requirements

4.1.1. Participation in reporting cycles

Our analysis reveals a high level of participation in UNFCCC reporting cycles by developed country Parties, in terms of submission of BRs. All 44 Annex I Parties have submitted the three BRs required during our study period, with the exceptions of Ukraine (which has submitted only its first BR) and the United States of America (which has not submitted its BR3, due by 1 January 2018, during the Trump administration’s tenure).

In terms of timeliness of submission, two findings are worth mentioning. First, as shown in Appendix Table A4, the sub-category of developed country Parties with the obligation to provide financial support (Annex II Parties) submitted their third BRs in a more timely manner than did other Annex I Parties (economies in transition). Second, for all developed countries, the third BR submissions were on average made in a timelier manner than the first BR submissions. As such, a clear finding is that timeliness is improving across the board and over time for developed country Parties, even as the timeliest submissions are still made by the sub-category of Annex II Parties. This is a relatively unsurprising finding, insofar as it confirms that the richest industrialized countries, those with the longest-standing experience with UNFCCC reporting and highest capacities to do so, are submitting reports in a more timely manner.

4.1.2. Adherence to reporting guidelines: the BR Transparency Adherence Index

The findings of our systematic analysis of adherence to reporting requirements are summarized in Table 2 below, our BR Transparency Adherence Index. This Index ranks developed country Parties according to the average adherence score of their three BRs. These indices bring together each country’s completeness and transparency scores in a composite index, as described in our methodology section. We have also calculated separate scores for each in our database, which is available as supplemental material 1.

Table 2. BR Transparency Adherence Index (Annex I Parties)

Our results display a wide range of performance across developed country Parties in adhering to mandatory reporting requirements. As Table 2 shows, Finland leads in adherence to mandatory UNFCCC reporting requirements amongst the sub-category of Annex II Parties, with an average adherence score across its first three BRs of 96%. Australia and Japan are in the top five as well, as are the European Union, Germany, the Netherlands, Sweden and the United Kingdom (with some of these countries tying for fifth place with the same score). The United States is last on the list of Annex II Parties, primarily because it did not submit a third BR. Its adherence scores for the first two submitted BRs are very high, however, and are comparable with the scores of the top five compliant Annex II Parties.

With regard to other Annex I Parties (broadly, countries with economies in transition), Turkey leads the BR Transparency Adherence Index, with a high adherence score of 93%. This is primarily because Turkey has no GHG emission reduction target and is thus not obliged to report on assumptions, conditions and methodologies related to attainment of a quantified economy-wide emission reduction target and progress in achievement of targets. The others in the top five of the list include Estonia, the Czech Republic, Romania and Lithuania. Thus, the Baltic states, for example, display high adherence to UNFCCC reporting requirements. The Russian Federation ranks 12th out of a total of 20 ‘Other Annex I Parties’, with an average adherence score across its three submitted BRs of 77%. Finally, Ukraine appears last on the ‘Other Annex I Parties’ list, with an average adherence score of 24%, yet this is because it has submitted only one BR.

As Table 2 also shows, while there is varying adherence to reporting requirements across developed country Parties (both Annex II and other Annex I), we do not see any pattern of consistent improvement in adherence over time in the submission of successive BRs – i.e. on average, adherence to reporting requirements does not significantly improve through the BR submission cycle. Another finding is that the average adherence score of Annex II Parties for each BR submission cycle is systematically higher than that of ‘Other Annex I’ Parties. As we noted earlier, both groups of developed country Parties, Annex II Parties and other Annex I Parties, are subject to the same mandatory reporting requirements for their BRs.Footnote8 The only exception is that Annex II Parties are required to report on provision of support to developing country Parties, while other Annex I Parties do not have such an obligation.

The adherence patterns identified in Table 2 raise various questions meriting further empirical analysis, including why there is the observed divergence in adherence between developed country Parties, why adherence varies (or not) over time for a given Party, and what a given adherence level may signify in terms of political versus technical hurdles to climate reporting. We address these further in our discussion section.

Table 3 builds on our synthesis in Table 2 by distinguishing the average adherence score per section of developed country Parties’ BRs. A number of trends are identifiable here. First, the table shows that the average adherence score of Annex II Parties is systematically higher than that of other Annex I Parties, for each section of the BRs. Second, we see that reporting requirements relating to GHG emissions and trends are on average adhered to the most, by both Annex II Parties and other Annex I Parties. By contrast, it is striking that both groups are generally struggling to adhere to the reporting requirements on achievement of targets.

Table 3. Developed country Parties’ average adherence score per section of the BRs.

The patterns observed in Table 3 also raise important questions: why is the adherence score for achievement of targets significantly lower across the board for developed country Parties, compared to other categories of reporting? What political dynamics can help to explain this? Clearly, this is not only a technical struggle, since reporting on technically complex aspects such as GHG inventories is highly adhered to.

For Annex II Parties, the level of adherence to requirements about provision of support is even lower than that relating to achievement of targets. This too is an important trend to examine further and explain: why is this mandatory reporting category less adhered to? What prevents more complete adherence to a reporting requirement that is clearly of great political salience in this multilateral climate context?

Finally, looking at the different BR sections, there is no clear pattern of improvement over time. Average adherence scores rise and fall over time across the different sections, but mostly level off. The only exception is Annex II Parties’ adherence to reporting requirements for provision of support, which has improved over time (Table 3).

This last point raises additional questions. First, even as it remains the lowest adherence score in overall terms, why has reporting on provision of international support improved over time? Is this because of higher levels of support being disbursed by these countries, which they are then willing to report? Or because definitional challenges and contestations relating to what constitutes climate finance and support are becoming clarified over time (but research suggests this is not the case, see Roberts & Weikmans, Citation2017; Weikmans & Roberts, Citation2019)? Or because countries are learning through the submission of BRs how to report to satisfy the technical teams of experts? Or rather because of political pressure around the need for increased clarity on support emerging from parallel negotiation of the 2015 Paris Agreement?Footnote9 These possibilities suggest an important research agenda around the dynamics and political implications of reporting on climate finance that merit further study.

We discuss further these findings and the additional questions that they raise in the discussion section below, after first presenting developing country adherence trends next.

4.2. Developing country compliance with UNFCCC transparency requirements

4.2.1. Participation in reporting cycles

With regard to developing country participation in reporting cycles, our synthesis reveals that 55 (out of 156) non-Annex I Parties have submitted a BUR so far. Our data also show that the level of engagement is very different between developing country Parties that are LDCs and/or SIDS and others (see Table 4). This is not surprising given that LDCs and SIDS can submit BURs at their discretion.

Table 4. Developing country Parties’ engagement in submitting BURs.

Table 4 shows that 46 (out of 77) developing country Parties with neither LDC nor SIDS status have submitted their first BUR and only nine have submitted three BURs. Furthermore, only seven developing country Parties with neither LDC nor SIDS status submitted their BUR1 in a timely manner (i.e. by December 2014). Nine (out of 79) Parties with LDC and/or SIDS status have submitted their first BUR so far, with only one Party (Singapore) with SIDS status having submitted three BURs.

These trends and patterns raise a number of important questions. How can these varying patterns of engagement be explained? Why are some states (with non-LDC/SIDS status) participating more than others? Are the challenges of engaging in multilateral transparency arrangements ‘technical’, i.e. relating to lack of data and lack of capacity to monitor, measure and report on greenhouse gas emissions or climate actions? Are they related to difficulties in accessing international financial resources aimed at supporting the timely preparation of BURs? Or do they relate to lack of priority for climate transparency reporting? Or to allocation of scarce resources for reporting on adaptation, for example, rather than reporting mitigation actions? These propositions have some basis in secondary and grey literature but have been little examined systematically.

4.2.2. Adherence to reporting guidelines: the BUR Transparency Adherence Index

In addition to frequency of reporting, our analysis reveals a wide variation in adherence of developing country Parties to mandatory reporting requirements. Given the large number of these Parties, the complete adherence scores for each are provided in supplemental material 2.

Table 5, our BUR Transparency Adherence Index, shows the adherence scores of the twenty biggest GHG emitters among developing country Parties. It is important, first and foremost, to note that we have listed countries in this Index according to emission levels rather than from highest to lowest adherence. The most striking finding here is that seven of these twenty countries had not submitted even their first BUR by February 2020 (i.e. more than four years after the suggested deadline to submit the first BUR). Another key finding is that, while the adherence score of some of these twenty countries (e.g. India and Argentina) has greatly improved from BUR1 to BUR2, this is not a general trend, as can be seen from the adherence scores of Parties such as Brazil, South Africa and Thailand.

Table 5. BUR Transparency Adherence Index (top 20 GHG emittersTable Footnotea among non-Annex I Parties).

Also noteworthy is that eight BURs (i.e. BUR1 of Brazil, China, Ghana, the Republic of Moldova, and Singapore; and BUR2 of India, Malaysia and Singapore) have achieved full adherence to the reporting requirements, as per the methodology for such assessment employed by UNFCCC teams of technical experts. Yet, this raises a salient question: what is the significance of having 100% adherence to reporting requirements? Does it signal adequate capacities in data availability and information generation within a country on aspects covered by mandatory reporting, such as GHG inventories or mitigation actions, with no need for further improvements? Or does the 100% adherence score simply capture improved ability to report over time in a manner consistent with technical analysis guidance? And if so, what might this imply about Brazil’s lower adherence score for BUR2? Another striking example is South Africa, which has consistently voiced support for UNFCCC transparency arrangements in international negotiations but shows relatively low adherence scores over time, according to our synthesis.

While Table 5 gives an indication of adherence scores over time of the top 20 emitters amongst developing country Parties, Table 6 lists the average adherence score per section of all BURs.Footnote10 It also distinguishes between the adherence scores of LDC and/or SIDS Parties on the one hand, and other developing country Parties on the other. These results should be considered with caution – especially those related to BUR3 and to LDCs/SIDS – given that they are based on a very limited number of BURs (i.e. only those for which a technical summary report was available by February 18th, 2020). Nonetheless, a clear trend discernible from the Table is that developing country Parties generally adhere more to reporting requirements for national GHG inventories, as compared to reporting on mitigation actions.

Table 6. Developing country Parties’ average adherence score per section of the BURs.

This is an important finding that merits more detailed empirical scrutiny. Is this because GHG inventories are prioritized within the capacity building initiatives to help developing country Parties comply with mandatory UNFCCC transparency requirements? Furthermore, since our Index captures adherence only to mandatory reporting requirements, it does not tell us how this group of countries are faring with regard to reporting on aspects other than GHG inventories and mitigation, in particular adaptation actions and assessing support needed and received.

Table 6 also reveals that the average adherence scores for this group of countries as a whole have improved from BUR1 to BUR2, but then appear to have declined again somewhat between BUR2 and BUR3. This suggests again a need for further detailed empirical analysis to understand technical, political and capacity-related explanations for these observed trends.

5. State compliance with transparency requirements: a research agenda

In presenting our Transparency Adherence Indices above, we raised some questions in connection with specific adherence patterns. Here we identify additional overarching questions and a future research and policy agenda that these Indices help to draw attention to.

First, an important task is to explain identified levels of engagement in UNFCCC transparency arrangements, as reflected in the Transparency Adherence Indices. In particular, it is important to understand why adherence scores are significantly lower for some countries, or for some categories of reporting, than for others. In explaining and comparing varying adherence to the different categories of reporting, it may well be that some requirements are technically ‘easier’ to adhere to than others, while others are more politically sensitive (Weikmans et al., Citation2020). Yet such a hypothesis requires systematic, comparative empirical scrutiny.

A related common presumption is that there are severe capacity constraints facing countries, particularly developing countries, in engaging meaningfully in UNFCCC transparency processes. These constraints may partly explain the relative lack of engagement in reporting by developing country Parties. However, the ‘struggle’ to report is clearly not only a technical one, as the Indices reveal that reporting on technically complex aspects such as GHG inventories is generally highly adhered to, by both developed and developing country Parties, at least by those who engage in reporting. By contrast, developing country Parties struggle with reporting on mitigation actions, and developed country Parties generally lack adherence to reporting requirements linked to achievement of targets. For Annex II Parties (developed countries with an obligation to provide support), the level of adherence to reporting requirements about provision of support is even lower than that relating to achievement of targets. This is striking, given that these are highly politically salient categories of reporting and action in the climate context.

Another hypothesis to explain varying adherence is that there might be active political reluctance to report on national climate actions (or lack thereof) to multilateral institutions such as the UNFCCC, as illustrated by the lack of submission of BRs from the United States during the Trump administration’s tenure. Likewise, it may be that developed country Parties that have signalled political reluctance to engage in top-down mitigation targetsFootnote11 are also less prone to adhere to the reporting requirements about mitigation. This suggests a pressing need to look at domestic political contexts as well, to understand diverse patterns of engagement with and adherence to UNFCCC transparency requirements, both comparatively between Parties and over time for a given Party.

Second, an overarching set of questions pertains to the merits of the technical assessment of adherence as institutionalized within the UNFCCC, which we have drawn upon here to generate our Indices. Is the current process of assessing state ‘compliance’ with mandatory reporting requirements through the technical expert review/analysis system adequate and relevant, or is it superficial? This relates also to the questions we raised in the previous section about what a 100% adherence score implies. Given that diverse technical teams assess national reports, are there also differences in the stringency of assessment, with some more lax in their reviews, and does this matter? Is this process of technical assessment able to capture important dynamics in participation and adherence that are key to securing the ultimate aims of enhanced transparency: to further accountability, mutual trust and ambition?

More specifically, what are the political implications of these ‘technical’ assessments of country reports? What is not being assessed in these technical assessments? And what is not being asked of Parties to report on in the first instance? Even with complete adherence to UNFCCC reporting requirements, transparency might yet fail to be transformative if the scope of transparency is limited by political considerations or constraints of national sovereignty. Or, as the example of reporting relating to international support suggests, it may be hard to assess adherence to this category of reporting, given the large margin for discretion in interpreting what constitutes support (see Weikmans & Roberts, Citation2019). Such reporting might be assessed as being in full adherence, but this does not remove underlying political contestations over the meaning of support (Ciplet et al., Citation2018).

Third and finally, our Transparency Adherence Indices lead us to advance an intriguing hypothesis meriting further empirical scrutiny: that a Party exhibiting more climate ambition in the first instance (in terms of mitigation or provision of international support) may perform better in adhering to UNFCCC transparency requirements as well. Thus, instead of transparency stimulating greater climate ambition, greater ambition may result in greater adherence to transparency. Yet, if this is the case, the question arises whether transparency follows rather than shapes political developments and dynamics. Our Transparency Adherence Indices shed light on some of these dynamics and set the stage to explore their implications further.

6. Conclusion

This article has presented a comprehensive overview of the extent to which countries that are Parties to the UNFCCC are engaging with climate transparency requirements. We presented a two-part ‘Transparency Adherence Index’ (the BR and BUR indices) to distil dynamics of participation and varying patterns of adherence to reporting requirements by both developed and developing country Parties since 2014. To construct our indices, we used as our primary data source the summary assessments of adherence contained in the technical reviews of BRs and technical analyses of BURs undertaken by teams of experts under the UNFCCC.

As seen above, our Transparency Adherence Indices provide a quantitative measure of varying adherence patterns that are comparable over time and across Parties (within the categories of developed and developing country Parties – it is important to note that the numbers are not comparable across these two categories, given different reporting requirements and criteria for assessing adherence). Generating these indices is a necessary first step to furthering an important research agenda centred around explaining the observed trends and patterns. We have thus also highlighted general and specific questions arising from the trends discernible in our Transparency Adherence Indices, which constitute a timely future research and policy agenda.

Ultimately, addressing these questions is necessary in order to shed light on the transformative potential of transparency, in practice, within multilateral climate governance. And herein lies the policy relevance of the adherence assessment undertaken in this article. Our focus here was on the latest reports (the BRs and the BURs) generated by developed and developing country Parties since 2014. Yet these reports form the bedrock upon which the Paris Agreement’s ‘enhanced transparency framework’ is being built, with the two parallel reporting trajectories to be replaced by one ‘Biennial Transparency Report’ to be submitted by all countries starting in 2024. As such, the adherence indices we have presented here provide vital context for assessing the continued relevance of ever-expanding transparency requirements within the UNFCCC. Specifically, the adherence patterns in our Indices shed light not only on what engagement looks like currently, but also raise the question of what constitutes meaningful adherence in the first instance. They permit the posing of an important question: To what extent is (more or less) adherence to transparency requirements linked to more far-reaching climate action, if at all? Related to this, the Indices can help to pinpoint areas of adherence that may need particular attention going forward (such as reporting on transparency of support; or voluntary elements of reporting, such as reporting on adaptation). Finally, it is important, in designing and implementing the Paris Agreement’s enhanced transparency framework, to continually reconsider the resource-intensive nature of the technical expert review process, and the manner in which adherence is assessed therein, an aspect that our Indices also shed light on.

As a concluding observation, we note that the Transparency Adherence Indices that we generate here can also be used, in and of themselves, to benchmark countries. This may have the consequence of incentivizing states to improve adherence to UNFCCC reporting requirements. While the assumption that ‘naming and shaming’ or ‘naming and praising’ can transform behaviour towards more environmentally ambitious actions is widespread in the transparency literature, it too remains empirically under-scrutinized. Thus, a final research question going forward is whether making visible patterns of state adherence to transparency, as we do here through our Indices, can help stimulate changes in behaviour. Or rather, whether the political and structural constraints underpinning the observed levels of engagement cannot be overcome merely by shedding greater light on them. Thus, whether making transparent state efforts to be transparent is transformative of behaviour is also a key question our analysis raises.

Supplemental material

Supplemental Material

Download Zip (103 KB)

Acknowledgements

The authors thank Fabian Vallespin for his help with the coding. We also thank three anonymous reviewers for their thoughtful comments and suggestions for improvements. We are grateful to the Editor for her valuable guidance and support.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Fonds de la Recherche Scientifique (F.R.S.-FNRS) under grant number 32709873 (Romain Weikmans); and by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO) funded TRANSGOV project (Aarti Gupta).

Notes

1 The enhanced transparency framework however provides for ‘built-in flexibility that takes into account Parties’ different capacities’ (UNFCCC, Citation2015, article 13.2). Flexibilities will be available to developing country Parties on specific elements concerning the scope, frequency and level of detail of reporting and review (see UNFCCC, Citation2019, Annex, paragraph 6; for an analysis of these flexibilities, see Weikmans et al., Citation2020). In addition to these flexibilities, Least Developed Countries (LDCs) and Small Island Developing States (SIDS) will have further discretion in the implementation of their obligations.

2 For the International consultation and analysis, LDCs and SIDS can be analysed in groups rather than individually.

5 There are two exceptions to this principle: ‘(a) where one ‘shall’ requirement contains an additional mandatory reporting requirement, such as in the case of the reporting of projections; (b) where the same paragraph of the UNFCCC reporting guidelines on BRs contains multiple mandatory reporting requirements which are interdependent, namely, paragraphs 6, 14, 15, 17 and 22 (see suggested approach described under the review challenge C.6 below.)’ (UNFCCC Secretariat, Citation2019, pp. 8–9).

6 For details, see UNFCCC (Citation2002, Annex, para. 14); UNFCCC (Citation2011, para. 41(g)).

7 For details, see UNFCCC (Citation2011, Annex III, para. 12).

8 Adherence score per section and per BR is provided for each Annex I Party in supplemental material 1.

9 A key political dynamic in the negotiations around an ‘enhanced transparency framework’ applicable to all under the Paris Agreement was to give parity to transparency of mitigation actions and transparency of support.

10 The adherence score per section and per BUR is available for each developing country Party in supplemental material 2.

11 Belarus, Canada, Japan, Kazakhstan, the Russian Federation, Ukraine and the United States of America are Annex I Parties that have not ratified the Doha Amendment of the Kyoto Protocol that establishes a second emission reduction commitment period running from 2013 to 2020.

References

Appendix

Table A1. Completeness and transparency assessment scoreboard for BRs

Table A2. Scorecard used to construct the Adherence Index of each section of a BR

Table A3. Scorecard used to construct the Adherence Index of each section of a BUR

Table A4. BRs submission timeliness