GENEVA MANUAL – CHAPTER 1

On Responsible Behaviour in Cyberspace

Implementation of norms to secure supply chains and encourage responsible reporting of ICT vulnerabilities: Who needs to do what?

In dealing with a critical vulnerability, who is expected to do what in order to minimise security risks?

To answer this question, the international community fortunately has the framework we previously introduced. This framework helps us define the expectations for achieving cyber-stability. As mentioned earlier, the framework includes non-binding norms, among other elements, with two particular norms of special relevance for our discussion about ICT vulnerabilities and supply chain risks:

13i “States should take reasonable steps to ensure the integrity of the supply chain so that end users can have confidence in the security of ICT products. States should seek to prevent the proliferation of malicious ICT tools and techniques and the use of harmful hidden functions.”

13j “States should encourage responsible reporting of ICT vulnerabilities and share associated information on available remedies to such vulnerabilities to limit and possibly eliminate potential threats to ICTs and ICT-dependent infrastructure.”

UN GGE report

However, these norms are, by design, abstract and general in scope – and voluntary in nature. Who should read them – and how?

Unpacking the two norms: What did States specifically agree about, and do other stakeholders concur?

While not legally binding, both norms represent a collective understanding, confirmed by all UN Member States, of how to ensure a safer digital landscape. In 2021, States confirmed the eleven cyber norms as part of the cyber-stability framework and agreed upon implementation points for each of them. However, closer examination of these concrete suggestions and steps opens up numerous questions.

In particular, when discussing norm 13i (related to supply chain security), States agreed on measures ranging from broad ones, such as putting in place, at the national level, transparent and impartial frameworks and mechanisms for supply chain risk management, to more narrowly defined ones (e.g. putting in place measures that prohibit the introduction of harmful hidden functions and the exploitation of vulnerabilities in ICT products). The 2021 UN GGE report clarifies that States are the actors primarily responsible for implementing this norm. At the same time, however, States agreed that the private sector and civil society should assume a relevant role in the process. What could concrete responsibilities for these stakeholders look like? The norm does not clarify this issue further.

With regard to norm 13j (related to responsible reporting of ICT vulnerabilities), the language is less detailed and specific. The norm stresses the necessity of ‘timely discovery and responsible disclosure and reporting of ICT vulnerabilities’. It also mentions that States could consider developing impartial legal frameworks, policies, and programmes on vulnerability handling; developing guidance and incentives; and protecting researchers and penetration testers. These measures would find broad support among cybersecurity experts, users, and other stakeholders; however, the details are critical – what do ‘impartial legal frameworks’ mean? How will States protect researchers and penetration testers? What would ‘responsible reporting’ entail? And to whom should vulnerabilities be reported to ensure responsible reporting? The norm does not clarify this either.

Discussions with the Geneva Dialogue experts have highlighted that these questions are very much on the minds of stakeholders. The experts have raised additional concerns, such as how to tackle the current geopolitical challenges arising from technological competition between countries and from differing rules and regulations in this field. These challenges, and the risk of conflicting rules and laws across countries, can present hurdles for researchers and industry players trying to collaborate across borders to put the norms into action.

The role of governments in the implementation of these norms raised another concern, especially with regard to States with advanced cyber capabilities that stockpile vulnerabilities for their offensive and defensive cyber programmes. How can trust be built between relevant non-state stakeholders and governments to implement these norms and encourage responsible vulnerability disclosure? And how can information exchange be facilitated between States and relevant non-state stakeholders, as well as between different States?

The Geneva Dialogue experts have also expressed concerns about the implementation of the norm 13i on supply chain security. In particular, it has been noted that the ICT supply chains now involve multiple stakeholders, and that no single entity has complete control over them. The complexity of these supply chains, with various participants and cross-border data flows, makes achieving optimal security challenging. Each organisation makes security decisions based on its resources and capabilities, which may not align with the security needs of others. The absence of universally accepted methods for conducting evidence-based security assessments in supply chain security poses challenges for organisations of different sizes. They must make security choices and decide which digital products and suppliers can be trusted. All these decisions often have an immediate impact on the security of customers and users. In this context, the Geneva Dialogue experts stressed the need for globally accepted rules and standards for supply chain security, promoting security by design and default in digital products. However, is it possible to develop such rules today, and is there an appropriate international platform for facilitating these discussions?

While norms set expectations, translating them into practical actions is of the essence. The Geneva Dialogue experts supported translating the norms as non-binding diplomatic agreements into more tangible processes, policies, and regulations. The key questions are how to develop such policies and regulations, and where to establish them. What should be the fundamental principles guiding the creation of such policies and regulations to effectively implement the essence of the norms?

With many open questions, the consultations with the Geneva Dialogue experts showed that relevant non-state stakeholders support the norms negotiated by states: if properly implemented, they can help significantly increase the security and stability in cyberspace. But the ‘devil is in the details’ and the key caveats are about ‘if’ and ‘properly implemented’ – what would this mean in practice?

With the Geneva Manual, we launch a global conversation on how the implementation of the norms for the security of cyberspace can become a reality or, where it is already a reality, what can be improved. Based on the idea that achieving effective cybersecurity requires continuous cooperation and commitment from all involved parties, we have outlined suggestions as to ‘who should do what’. With the help of our story (inspired by real events), we explore different roles within various stakeholder groups and delve into what each role can include and contribute. This involves understanding the expectations, motivations, incentives, and challenges faced by these groups. Through regular discussions with the Geneva Dialogue experts, we also discovered good practices that can inspire others in the international community to play their part in promoting cyber-stability.

Implementation of the two norms: Roles and responsibilities to achieve cyber-stability

We often say that cybersecurity is a team effort, but how can we ensure that such a ‘team’ works together effectively? To address this, we collected the views of the Geneva Dialogue experts: these multistakeholder inputs helped us analyse where roles start and end, which drivers are needed to incentivise responsible behaviour across relevant non-state stakeholders, and which challenges remain unsolved, therefore requiring further attention of the international community.

Role: Manufacturer and/or supplier of digital products
The role refers to a company or entity that produces or provides digital/ICT products and services, including software, hardware or a system configuration.

The role applies to small and medium-sized manufacturers and suppliers as well; however, not all suggested steps below are implementable by them, and certain prioritisation may be needed.

The private sector

As a result of consultations with the Geneva Dialogue experts, manufacturers have been identified as the actors expected to bear the primary responsibility for addressing ICT supply chain risks and risks from vulnerabilities in digital products, in order to ensure the security and safety of customers and users.

In particular, this responsibility, reflecting the collective expectations of users of digital products, entails the following:

  1. Implementing security by design practices in the development of digital products throughout their lifecycle and supply chain, in line with international standards and recognised security good practices
     (Footnote 1: Paris Call Principle #6 on Lifecycle Security: “Strengthen the security of digital processes, products and services, throughout their lifecycle and supply chain” – https://pariscall.international/en/principles; Charter of Trust Principle #2 on Responsibility throughout the digital supply chain and Principle #3 on Security by default – https://www.charteroftrust.com/wp-content/uploads/2023/07/Charter-of-Trust_Principles_EN_2023-07-25.pdf)
  2. Conducting security risk assessments of suppliers and digital products, including software from third parties and open-source components
  3. Evaluating and regularly updating an inventory of supplier relationships, contracts, and any products those suppliers provide
  4. Maintaining, regularly updating, and providing upon request information about the composition of its products, including information about integrated third-party and open-source components (known as a Software Bill of Materials (SBOM) and/or Hardware Bill of Materials (HBOM))
  5. Indicating the expected product lifecycle during which users can expect security updates and security support
     (Footnote 2: Charter of Trust Principle #2 on Responsibility throughout the digital supply chain: “Continuous protection: companies must offer updates, upgrades, and patches throughout a reasonable lifecycle for their products, systems and services via a secure update mechanism.” – https://www.charteroftrust.com/wp-content/uploads/2023/07/Charter-of-Trust_Principles_EN_2023-07-25.pdf)
  6. Implementing vulnerability disclosure and management processes, i.e. responding to vulnerability reports and coordinating actions, where needed, with relevant parties (e.g. national authorities, CERT/CSIRT, researchers, other vendors, the OSS community) to remediate vulnerabilities (researchers’ expectations from manufacturers)
     (Footnote 3: 1) Paris Call Principle #5 on Non-proliferation: “Develop ways to prevent the proliferation of malicious software and practices intended to cause harm” – https://pariscall.international/en/principles; 2) GCSC Norm #6 to Reduce and Mitigate Significant Vulnerabilities: “[…] While it is currently very difficult to ensure that no vulnerabilities exist in newly released or updated products, rather, this proposed norm suggests that those involved in the development or production of such products take ‘reasonable steps’ that would reduce the frequency and severity of those that do occur. […] this proposed norm seeks to have those who develop or produce critical products take reasonable measures to ensure that the number and scope of critical vulnerabilities are minimized and that they are effectively and timely mitigated and, when appropriate, disclosed when discovered. The process used should be transparent to create a predictable and stable environment.” – https://hcss.nl/gcsc-norms; 3) Geneva Dialogue output ‘Security of digital products and services: Reducing vulnerabilities and secure design: Good practices’ – https://genevadialogue.ch/goodpractices/)
  7. Utilising standardised formats for vulnerability exchange (e.g. VEX) to allow automation and a quicker response in identifying which product or products are affected by a known vulnerability or vulnerabilities
  8. In the case of discovered and reported vulnerabilities in open-source software, communicating in a timely fashion with the OSS development team and notifying them of the vulnerability or fix (OSS community’s expectations from manufacturers)
  9. Building specialised security teams and developing effective organisational structures to promptly address vulnerabilities and security threats (researchers’ expectations from manufacturers)
  10. Proactively informing, as a first priority, affected customers and users – and national authorities, where required – about the released patch
  11. Assisting customers to help ensure that their products are deployed in a secure manner and communicating to the customers how to continually ensure the security of their digital products in deployment
  12. Utilising certification and standardisation bodies, as well as industry and trade associations, to team up with other manufacturers, technical communities, and relevant civil society organisations and academia to develop interoperable global rules and standards for supply chain security
     (Footnote 4: Charter of Trust Principle #7 on Cyber-resilience through conformity and certification – https://www.charteroftrust.com/wp-content/uploads/2023/07/Charter-of-Trust_Principles_EN_2023-07-25.pdf)
  13. Having an up-to-date software maintenance plan that includes alternative software components which can be used if an OSS developer fails to respond or patch vulnerable libraries
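Several of the expectations above – keeping an inventory of suppliers and components (points 2–4) and using standardised vulnerability-exchange formats (point 7) – boil down to being able to answer, quickly, which products ship a component with a known vulnerability. The following Python sketch illustrates that matching step in miniature. All product names, versions, and data structures here are illustrative simplifications: a real implementation would parse standard SBOM formats (e.g. CycloneDX or SPDX) and vulnerability-exchange documents (e.g. CSAF/OpenVEX) rather than hand-written dictionaries.

```python
# Illustrative sketch only: matching a toy SBOM-style inventory against
# known-affected component versions. Product names and data shapes are
# invented for this example, not taken from any real SBOM/VEX document.

# A minimal SBOM-like inventory: product -> list of (component, version)
sbom = {
    "payment-gateway": [("openssl", "3.0.1"), ("log4j-core", "2.14.1")],
    "admin-portal":    [("openssl", "3.0.8"), ("zlib", "1.2.13")],
}

# VEX-style statements: (component, version) pairs known to be affected,
# mapped to the relevant advisory identifier
known_affected = {
    ("log4j-core", "2.14.1"): "CVE-2021-44228",
    ("openssl", "3.0.1"): "CVE-2022-3602",
}

def affected_products(sbom, known_affected):
    """Return {product: [(component, version, advisory), ...]} for every
    product whose inventory lists a known-affected component version."""
    findings = {}
    for product, components in sbom.items():
        hits = [(name, ver, known_affected[(name, ver)])
                for name, ver in components
                if (name, ver) in known_affected]
        if hits:
            findings[product] = hits
    return findings

if __name__ == "__main__":
    for product, hits in affected_products(sbom, known_affected).items():
        for name, ver, advisory in hits:
            print(f"{product}: {name} {ver} affected by {advisory}")
```

Even this toy version shows why machine-readable SBOMs matter: once component inventories and advisories share identifiers, finding affected products becomes a mechanical lookup rather than a manual audit of every supplier relationship.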

The key incentives include:

  1. Regulatory pressure and liability for software security
    The Geneva Dialogue experts have discussed the need for governments and policymakers to step in and set standards to ensure the security and safety of digital products. They have debated the elements of a legal framework that would be widely accepted but, so far, there has been no agreement among stakeholders on how to strike the right balance.
    Some have called for stronger accountability when companies fail to address vulnerabilities promptly. At the same time, there is a consensus that it would be unrealistic to expect 100% security and to hold manufacturers responsible for the mere existence of vulnerabilities, as technology is rapidly evolving and the threat landscape is constantly changing.
    To improve security in digital products and help consumers make better choices, experts agree that standardisation, certification, and labelling schemes are of the essence. Standardisation is an important tool to raise the cybersecurity bar in organisations and products. Technical standards, defined through consensus-building and inclusiveness, provide a minimum set of requirements that help organisations achieve their cybersecurity goals, with an impact on the global ecosystem. Once cybersecurity standards are agreed, demonstrating conformity to these standards is also strategic: for instance, to comply with relevant legislation and regulations, but also, more generally, to build trust within the market (i.e. with customers). These measures can stimulate stronger security in digital products, address information gaps, and empower users to make more informed purchases.
    However, the focus of regulation should not only be on the end product. Instead, the emphasis should be on defining and assessing robust cybersecurity processes. For instance, rather than mandating manufacturers to produce products completely free of vulnerabilities, regulations should require them to establish strong cybersecurity processes that continuously test products and promptly address any vulnerabilities discovered or reported. This way, the emphasis is on building a proactive and effective security approach.
  2. Pressure from customers and users to adhere to security standards
    The Geneva Dialogue experts who represent manufacturers of digital products stressed that their customers and users are the main drivers who request greater security in products. To meet such customer demands, manufacturers are compelled to perform compliance checks and ensure that their products adhere to industry security standards. Failing to do so could lead to a loss of customer trust and, in some cases, legal liabilities (in the event of security breaches or vulnerabilities, for example). If a digital product is found to be insecure and leads to data breaches or other security incidents, the manufacturer can face legal consequences, reputational damage, and financial losses.
    Therefore, the fear of losing customers and facing potential legal consequences acts as a strong indirect incentive and pressure for manufacturers to continuously enhance the security of their products.
  3. Market competition
    Benchmarking against competitors pushes manufacturers to meet, or exceed, existing security standards; this form of peer pressure drives a culture of continuous improvement in security practices. At the same time, interconnected supply chains and business partnerships that create benefits – from access to valuable information to admission to large partners’ ecosystems – create certain expectations of a trusted and reliable company, where security becomes one of the key criteria.
  4. Security risks
    When security breaches happen within the industry, companies closely observe these incidents and their repercussions on the affected organisations. Such incidents serve as cautionary precedents, motivating companies to assess their own security posture and invest in preventive measures to avoid similar vulnerabilities.
  5. Reputational risks
    A security breach, or revelations of poor security practices that result in security risks for users, can cause significant harm to a company’s reputation, undermine customer trust and loyalty, and result in a decline in business. The fear of being seen as untrustworthy and unreliable in the eyes of stakeholders (including government stakeholders and regulators) and customers pushes manufacturers to build a proven track record of strong security measures and a dedicated focus on cybersecurity.

The Geneva Dialogue experts have been asked about factors which prevent manufacturers from implementing the actions above and, therefore, from following the norms. The key challenges include:

  1. High costs of required measures
    Cybersecurity measures and, in particular, the adoption of stricter secure software development practices require expertise and time. For small and medium-sized enterprises (SMEs), often operating with limited budgets and with general IT personnel responsible for all ICT-related processes, this can be a tough challenge to meet.
  2. Complexity and lack of expertise
    The lack of expertise in cybersecurity poses a significant challenge for all organisations when it comes to investing more in their security measures. Implementing effective cybersecurity protocols requires specialised knowledge and skills. This especially affects small companies, as they may not have access to skilled cybersecurity professionals, or find it financially challenging to hire external experts. As a result, they may be hesitant to invest in cybersecurity measures they feel ill-equipped to handle, such as creating and maintaining a vulnerability disclosure program.
    Furthermore, successful cybersecurity implementation involves robust asset management, which allows organisations to identify vulnerabilities before they can be exploited. While small organisations with limited resources may effectively manage their assets, medium enterprises might already find it difficult to do so. As organisations grow larger, the task of keeping track of all assets becomes near-impossible.
    Additionally, many organisations integrate open-source software (OSS) into their systems without fully understanding the potential consequences and risks associated with using code developed outside their organisation. The challenge also lies in having the proper skills and knowledge to conduct the necessary security assessments of such components. A one-size-fits-all approach with centralised assessments can hardly be implemented – security risks are contextual, as the Geneva Dialogue experts noted several times, and manufacturers should rely on knowledge of their own systems and landscape to identify, through such security reviews, which components can be trusted and relied upon.
    It should also be noted that the lack of cybersecurity expertise is a universal challenge, which even bigger, better-resourced organisations may face.
  3. Lack of, or low awareness of, the business justification and rationale for implementing required security measures
    The Geneva Dialogue experts have pointed to a widespread issue across many industries: the difficulty of translating the technical language of vulnerabilities and their impact into terms that CEOs and decision-makers can understand and relate to their business objectives.
    One contributing factor to this challenge is that some companies may underestimate the risk of a cybersecurity breach occurring within their organisation, particularly if they haven’t experienced such an incident in the past. This perception of low risk can lead to complacency, where companies become less inclined to invest in cybersecurity until they encounter a breach, or face regulatory pressure.
    Another aspect is the lack of immediate tangible returns from cybersecurity investments. Unlike investments in product development or marketing, the benefits of cybersecurity may not be immediately apparent. This can make it challenging for some companies to justify the costs for cybersecurity, as they may prioritise activities that yield more immediate revenue. Moreover, investing in cybersecurity involves diverting financial resources from other areas of the business.
  4. Lack of international cooperation and the complex regulatory and policy landscape
     (Footnote 5: “States, reaffirming General Assembly resolution 70/237 and acknowledging General Assembly resolution 73/27, should: take reasonable steps to ensure the integrity of the supply chain, including through the development of objective cooperative measures, so that end users can have confidence in the security of ICT products; seek to prevent the proliferation of malicious ICT tools and techniques and the use of harmful hidden functions; and encourage the responsible reporting of vulnerabilities”, OEWG Final Substantive Report 2021, para 28 – https://dig.watch/resource/oewg-2021-report)
    The lack of international cooperation in setting cybersecurity standards and practices leads to inconsistencies in regulations across different countries and regions. This creates a challenging environment for organisations that operate globally, or have customers and partners in multiple jurisdictions. Adhering to varying cybersecurity requirements can be time-consuming, costly, and logistically demanding.
    At the same time, the constantly evolving and complex regulatory landscape creates uncertainty for organisations. The lack of clarity on future regulations and requirements makes it challenging for companies to plan and allocate resources effectively. This uncertainty can discourage investments in cybersecurity, as companies may hesitate to commit significant resources to initiatives that may become obsolete or non-compliant in the future.
  5. Difficulties in certifying and/or conducting a security assessment of a digital product in its entirety, due to the complexity of software composition and the use of third-party components
    To accelerate development and reduce costs, manufacturers often integrate third-party components and libraries into their products. While these components can provide valuable functionality, they also introduce potential security risks. Manufacturers may have limited visibility and control over the security practices of third-party vendors, making it difficult to ensure the overall security of the product.
    At the same time, the lack of standardised and comprehensive certification processes for digital products poses a challenge. Unlike industries with well-established certification frameworks (e.g. safety certifications for physical products), the certification of digital products’ security is often less standardised and more complex. The absence of clear guidelines can make it difficult for manufacturers to determine what security measures are necessary, and what level of security should be achieved.
  6. The limitations of technical approaches to address trust issues related to ICT supply chain security
    These limitations include issues related to trust in suppliers and considerations surrounding the country of origin of the various components used in a product. The technical community and industry are aware of these challenges and recognise the need for criteria that encompass both political and technical factors. They see the potential for creating globally interoperable criteria that can effectively evaluate and mitigate supply chain risks. However, creating globally interoperable criteria that effectively address these multifaceted concerns is a challenging task that requires trust and political will from various stakeholders, including governments.
     (Footnote 6: “States, reaffirming General Assembly resolution 70/237 and acknowledging General Assembly resolution 73/27, should: take reasonable steps to ensure the integrity of the supply chain, including through the development of objective cooperative measures, so that end users can have confidence in the security of ICT products; seek to prevent the proliferation of malicious ICT tools and techniques and the use of harmful hidden functions; and encourage the responsible reporting of vulnerabilities”, OEWG Final Substantive Report 2021, para 28 – https://dig.watch/resource/oewg-2021-report)
  7. The emerging trend of governments mandating vulnerability reporting directly to them, rather than to the vendors
    While the intention behind these regulations may be to enhance cybersecurity and create a centralised repository of vulnerabilities, there are inherent risks involved. Collecting all vulnerabilities from various companies into a government database raises concerns about the security and confidentiality of such sensitive information. The potential for data breaches or unauthorised access to such a database could expose critical vulnerabilities, putting not only the companies at risk, but also the users of their products.
    Another challenge lies in the lack of trust between the private and public sectors. Manufacturers may be hesitant to report vulnerabilities, particularly those which are not patched yet, directly to the government, fearing that the information could be mishandled, misused, or not adequately addressed. This lack of trust can lead to underreporting of vulnerabilities, leaving potential security loopholes unaddressed.
    Moreover, governments’ involvement in vulnerability evaluation and reporting can be influenced by self-interest and national security concerns. In some cases, there may be a tendency to prioritise certain vulnerabilities over others based on national interests, potentially leading to the non-disclosure of critical vulnerabilities that affect the security of digital products.
     (Footnote 7: GCSC Norm #3 to Avoid Tampering: “[…] the norm prohibits tampering with a product or service line, which puts the stability of cyberspace at risk. This norm would not prohibit targeted state action that poses little risk to the overall stability of cyberspace; for example, the targeted interception and tampering of a limited number of end-user devices in order to facilitate military espionage or criminal investigations.” – https://hcss.nl/gcsc-norms/)

Please note that these practices, here and elsewhere in the Geneva Manual, are not exhaustive; the Geneva Dialogue will continue to add good practices to inspire others in the international community to implement the norms.

  • Emerging cybersecurity regulations should avoid requirements that mandate the reporting of unpatched vulnerabilities to anyone but the code owner, in order to minimise the risk of malicious actors accessing this information. Where code owners do not cooperate, governments can play a role by putting pressure on such vendors to participate in responsible vulnerability disclosure
  • Governments need to enhance transparency about their vulnerability equities processes (VEP) or government disclosure decision processes. This would include making public the information about the scope, the government agencies involved, the principles that guide government decision-making in responsible vulnerability disclosure, and the oversight mechanisms. Such measures can help boost trust across the private sector and research community to cooperate with governments in responsible vulnerability disclosure
     (Footnote 8: GCSC Norm #5 for States to Create a Vulnerabilities Equities Process: “[…] Given that vulnerability discovery and disclosure is broader than any one state, in order to promote network resilience while at the same time safeguarding national security, it would be in the interest of the long-term stability of cyberspace for every state to have such a process in place. Additionally, states should work towards compatible and predictable processes. The existence of such processes can act as a confidence-building measure between states in that it provides some assurance that relevant equities and competing interests are fully considered. Of course, every state has differing capabilities and unique interagency structures, however, any effective VEP process should be designed to take a broad range of perspectives and equities into account. In addition, though the actual decisions reached in individual cases may, out of necessity, remain confidential, there should be transparency on the general procedures and framework for reaching such decisions. Finally, this norm deals only with the establishment of a process where disclosure decisions are made. If a government or any other entity decides to make a disclosure, such disclosure should be made in a responsible manner that promotes public safety and does not lead to exploitation of that vulnerability.” – https://hcss.nl/gcsc-norms/)
  • New regulations concerning digital product security should avoid the one-size-fits-all approach and, instead, tailor their requirements to the unique characteristics of each product category, such as cloud services and IoT devices, taking into account their distinct use cases, processes, and data handling practices
  • Governments need to step in to create better incentive programmes for organisations to invest more in the security of digital products (e.g. with the help of insurance companies)
  • A neutral and geopolitics-free governance framework is required to approach the security of ICT supply chains and digital products globally. Many organisations, as the Geneva Dialogue partners emphasised, need fact-based security assessments of technology, software, and suppliers to reduce security risks
     (Footnote 9: “States, reaffirming General Assembly resolution 70/237 and acknowledging General Assembly resolution 73/27, should: take reasonable steps to ensure the integrity of the supply chain, including through the development of objective cooperative measures, so that end users can have confidence in the security of ICT products; seek to prevent the proliferation of malicious ICT tools and techniques and the use of harmful hidden functions; and encourage the responsible reporting of vulnerabilities”, OEWG Final Substantive Report 2021, para 28 – https://dig.watch/resource/oewg-2021-report)
  • The implementation of both norms and, particularly, efforts to address interconnected supply chain risks, require stronger international cooperation. Manufacturers and other private sector actors should be encouraged to participate more in such international discussions, including in the activities of the standardisation bodies and other international or regional industry processes
  • Addressing the certification challenges in complex multi-component digital products requires a multifaceted approach, including industry-wide collaboration, standardised certification processes, and a commitment to prioritising security throughout the product development lifecycle 10Paris Call Principle #6 on Lifecycle Security: “Strengthen the security of digital processes, products and services, throughout their lifecycle and supply chain”.
    https://pariscall.international/en/principles
    Charter of Trust Principle #2 on Responsibility throughout the digital supply chain and Principle #3 on Security by default
    https://www.charteroftrust.com/wp-content/uploads/2023/07/Charter-of-Trust_Principles_EN_2023-07-25.pdf

The Geneva Dialogue experts have emphasised the necessity for more targeted discussions to precisely specify which of the aforementioned steps (or additional ones) are applicable to small and medium-sized organisations. They also highlighted the importance of supporting these organisations, considering their limited resources, in adopting security practices. The question of how to provide such support and tailor it more effectively, especially for organisations within the ICT supply chain so they do not pose a cybersecurity risk, remains an open consideration.

Furthermore, there is a recognised need for a more detailed analysis of ‘sub-roles’ within the manufacturing sector, acknowledging that different sectors, such as telecom or banking, may be subject to varying degrees of regulation and, consequently, differing responsibilities.

Addressing the challenge of incentivising manufacturers to invest in cybersecurity during the development of their digital products is complex. Regulation is not always necessary; such incentives also depend on customer behaviour and customers' security demands. The Geneva Dialogue experts, particularly those from the private sector and industry, have expressed their expectations for regulators to play a role in promoting a cybersecurity culture through a ‘whole-of-government’ and ‘whole-of-society’ approach. This involves measures such as ensuring and promoting standards for vulnerability exchange, developing government vulnerability disclosure policies, ensuring transparency in how authorities handle vulnerabilities responsibly, and setting a precedent by implementing these norms and cooperating with relevant non-state stakeholders.

However, while the Geneva Dialogue experts expressed a desire for a global, neutral, and geopolitics-free governance framework to secure ICT supply chains and digital products, it remains unclear if such a framework can be established at all, given the growing fragmentation in regulatory efforts across countries. Therefore, tackling the implementation of the norm, specifically 13i, and addressing risks associated with ICT supply chain security in today’s context, marked by increasing polarisation and technological competition between jurisdictions, poses a challenge that necessitates international approaches such as the Geneva Dialogue.

 

Role: Open-source software (OSS) community

If you were the owner of an open-source tool where the vulnerability had been discovered, what actions would you take to minimise the security risks? What difficulties may you encounter in taking such actions?

The role refers to an individual, or a group of individuals, who contribute to the development, improvement, and maintenance of OSS projects. This includes the code owners, as well as the repositories and organisations that maintain them. OSS refers to software whose source code is made freely available to the public, allowing anyone to view, modify, and distribute the code. The terms OSS contributors, developers, and maintainers are used interchangeably in the Geneva Manual.

Technical community

Given the wide adoption of OSS in modern ICT products (e.g. 97% of applications leverage open-source code, and 90% of companies are applying or using it in some way according to GitHub) and recently discovered critical vulnerabilities (e.g. Log4Shell), the Geneva Dialogue experts have singled out the open-source community and developers. They have recognised the professional and ethical responsibility of the OSS community to produce software that is as secure as possible (and they are expected to follow the relevant OSS foundation guidelines), but not a legal responsibility to do so. Since OSS developers and maintainers may not have the resources and capacities to meet all security requirements (and in most cases work on a voluntary basis), the Geneva Dialogue experts emphasised the importance of collaboration between the private sector and the OSS community, as well as mutual support in this regard.

The Geneva Dialogue experts added that OSS developers and maintainers may need to consider commoditising security assessment tools, or making them freely available, to improve code quality as well as security. In this regard, the role of repositories has been specifically highlighted – they can help OSS contributors with the adoption of security practices for code development as well as support them with vulnerability reporting concerning their repositories.

Some of the incentives for the OSS community to adopt stricter security practices, as well as to follow the two cyber norms, include:

  1. Community reputation
    By prioritising security, OSS developers can build a reputation for producing reliable and secure software, which enhances trust among community contributors and users.
  2. Personal and professional growth
    By following security practices, OSS developers can make more valuable contributions to software development, thus enhancing their career prospects.

OSS developers face several challenges in producing more secure code. One of the main challenges is the unrealistic expectations often placed on OSS developers, under the assumption that they have the same level of resources as closed-source companies. However, there are certain key differences between the two that affect the way security is handled:

  1. Lack of contractual obligations
    Closed-source companies typically have contractual obligations with their customers or users, which may include service level agreements (SLAs) specifying response times and actions in case of security incidents. In contrast, OSS maintainers often work on a voluntary or a community-driven basis, and they may not have the same contractual obligations. This lack of formal obligations can make it difficult to meet specific response times or take immediate actions as expected.
  2. Limited resources for regular testing of software components
    Open-source projects, especially smaller ones, may have limited resources, including capacity and funding. Unlike proprietary software companies that may have dedicated teams for security, open-source developers might not have the same level of resources available to focus solely on security-related tasks. In most cases, OSS code-owners are developing and maintaining the code on a voluntary basis.
  3. Complexity of a community-driven development and multiple collaborators
    OSS is often developed collaboratively by a community of contributors, each with their own priorities and areas of expertise. Coordinating and aligning the efforts of various contributors towards security goals can be challenging.
  4. Time and prioritisation
    OSS developers often contribute to projects in their spare time or as part of their other responsibilities. Balancing security efforts with other tasks and commitments can impact the time and priority given to addressing security concerns.
  5. Dependency chain risks
    Open-source projects may rely on other open-source components or libraries. Ensuring the security of the entire dependency chain can be a complex task, especially if some of the components lack proper security scrutiny.
  6. Lack of incentives
    In some cases, OSS developers may not receive financial incentives or direct rewards for investing time and effort in security improvements. This can demotivate some of the developers from prioritising security over other aspects of the project.
  • Security incidents in open-source projects can erode trust in the broader OSS community and impact the reputation of digital products built upon these projects. However, open-source projects play a crucial role in fostering technological innovation by providing cost-efficiency, interoperability and inclusivity for developers, regardless of their geographic location or organisational affiliation
  • To address the security challenges, open-source communities should prioritise security, implement good practices, provide educational resources, and establish effective processes for vulnerability management and patching. Increased collaboration between open-source projects, industry, and the broader cybersecurity community can also contribute to enhancing the security of OSS
  • In particular, open-source projects need to consider incorporating cybersecurity attestations into standard licences. This would establish a requirement for OSS developers and maintainers to adhere to minimum cybersecurity due diligence for committed code. These attestations could encompass the use of a standardised cybersecurity assurance pipeline, such as SAST and DAST, to assess the suitability of checked-in code. Additionally, OSS developers and maintainers might take on a minimum obligation to support vulnerability remediation
  • Larger organisations need to support the OSS community to develop more secure software
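
The dependency chain risk described above can be illustrated with a short sketch. Assuming a simplified dependency graph (the package names and the advisory are invented for illustration), a breadth-first walk over reverse dependencies shows how a single vulnerable component implicates everything built on top of it:

```python
# Hypothetical sketch: flag packages whose transitive dependency chain
# includes a component with a known advisory. The graph and advisory
# data below are illustrative, not real.
from collections import deque

def affected_by_advisories(dependencies, advisories):
    """Return the set of packages that depend, directly or transitively,
    on any package listed in `advisories` (the listed packages included)."""
    # Invert the edge direction: map each package to its dependents.
    dependents = {}
    for pkg, deps in dependencies.items():
        for dep in deps:
            dependents.setdefault(dep, set()).add(pkg)

    affected = set(advisories)
    queue = deque(advisories)
    while queue:
        current = queue.popleft()
        for parent in dependents.get(current, ()):
            if parent not in affected:
                affected.add(parent)
                queue.append(parent)
    return affected

# Example dependency graph: app depends on web, which depends on logging-lib.
graph = {
    "app": ["web", "crypto"],
    "web": ["logging-lib"],
    "crypto": [],
    "logging-lib": [],
}
print(sorted(affected_by_advisories(graph, {"logging-lib"})))
# → ['app', 'logging-lib', 'web']
```

A Log4Shell-style advisory on one deeply nested library implicates every product above it in the chain, which is why the experts stress scrutiny of the entire dependency chain rather than direct dependencies alone.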

Embracing more security in OSS development while not disincentivising contributors is critical and requires a more creative approach, as the Geneva Dialogue experts noted, such as support from private companies, industry, and the cybersecurity community. Introducing legislation to regulate OSS security is challenging for several reasons, and various members of the OSS community, including individual developers and open-source foundations, have already raised concerns about the proposed cybersecurity legislation in Europe – the Cyber Resilience Act (CRA).

One of the challenges is that OSS developers, whether independent individuals or nonprofit foundations, lack comprehensive knowledge about all of their users due to the freely distributed nature of their software. Implementing vulnerability remediation and issuing security patches to downstream users may therefore be a challenge, especially for those providing software for free. At the same time, platforms such as GitHub are taking steps to support contributors (see good practices above).

A lack of comprehensive knowledge about the users of the software also highlights the challenge of managing and keeping track of external libraries and dependencies (which may also be difficult for organisations and their in-house proprietary code).

In the meantime, with particular regard to the CRA, which may transform the software development industry, open-source foundations offer support to OSS developers by raising their awareness of the law’s possible impact once it is adopted and of the currently available ways to influence policy-making.

Artificial intelligence (AI) already assists developers in composing new code. On the one hand, this may allow less skilled individuals to produce their own code and thus ‘democratise’ code development; it may, however, also lead to even more widespread vulnerabilities. On the other hand, AI can help identify common vulnerabilities in widespread open-source code and, ultimately, help write more secure code. More effort is needed to apply AI solutions to secure software development in the future.

 

Role: Organisational customers of digital products/ICTs

As a customer and user of digital products, what would you expect from your suppliers? What would motivate you to keep trusting them?

The role refers to any organisation that procures, purchases, manages, and utilises digital products/ICTs for their own use, including to provide services based on such digital products/ICTs to their own customers and end-users.

This role includes, but is not limited to, critical infrastructure entities, small and medium organisations, but also other entities from the public and private sectors that provide digital products and services to citizen customers.

  • The private sector
  • Academia
  • Civil society
  • Technical community

While such organisations may not be directly involved in developing digital products or be responsible for the security of the products they purchase, they do have a responsibility to implement the two cyber norms. In particular, the Geneva Dialogue experts emphasised that while these organisations may not be the creators of digital products, they are still accountable for the security and safety risks associated with the services they provide if these services rely on ICTs from third-party vendors. If a critical vulnerability is discovered in the ICTs used by these organisations, they may even be held liable for negative security and safety consequences that arise as a result.

In various sectors and industries, many organisations are subject to specific regulations and laws that govern their operations concerning cybersecurity and data protection. For instance, critical infrastructure protection laws may apply to organisations that operate vital infrastructures like energy, transportation, or healthcare systems. Additionally, regulations related to personal data protection impose responsibilities on organisations that handle sensitive information.

By complying with the existing sector-specific laws and regulations, organisations can better ensure the security of their operations and the safety of their customers and users, and thus be able to implement the two norms. In particular, the following set of responsibilities, which primarily citizen customers expect from organisational customers, has been outlined in the Geneva Dialogue:

  1. Conducting vendor evaluation and selection before making procurement decisions and assessing the security practices of potential vendors
  2. Including security requirements in contracts to outline security standards, data protection measures, incident response protocols, and other provisions as identified by applicable rules and laws
  3. Conducting regular security audits of digital products and services that have been already procured to identify vulnerabilities and any other potential security risks, requesting the information about the composition of digital products and services (e.g. SBOM documentation)
  4. Ensuring compliance with applicable laws and regulations
  5. Conducting ongoing vendor management to monitor the security performance of technology providers and establishing regular communication channels with vendors to address security concerns
  6. Minimising human-related security risks and investing in user education and awareness to educate their employees and users about the proper use of digital products and services
  7. Conducting vulnerability management to ensure that all ICT systems and software are regularly updated with the latest security patches and updates 11Paris Call Principle #5 on Non-proliferation: “Develop ways to prevent the proliferation of malicious software and practices intended to cause harm”
    https://pariscall.international/en/principles
    GCSC Norm #6 to Reduce and Mitigate Significant Vulnerabilities: “[…] While it is currently very difficult to ensure that no vulnerabilities exist in newly released or updated products, rather, this proposed norm suggests that those involved in the development or production of such products take “reasonable steps” that would reduce the frequency and severity of those that do occur.[…] this proposed norm seeks to have those who develop or produce critical products take reasonable measures to ensure that the number and scope of critical vulnerabilities are minimized and that they are effectively and timely mitigated and, when appropriate, disclosed when discovered. The process used should be transparent to create a predictable and stable environment.”
    https://hcss.nl/gcsc-norms/
    (this represents researchers’ expectations from organisational customers, as well)
  8. Ensuring the secure integration of ICT systems, with the help of vendors or any other relevant parties
  9. Ensuring data security and, in particular, understanding how vendors handle and protect sensitive data, and ensuring compliance with relevant data protection regulations 12Charter of Trust Principle #3 on Security by Default: “[Companies should] adopt the highest appropriate level of security and data protection and ensure that it’s preconfigured into the design of products, functionalities, processes, technologies, operations, architectures, and business models”.
    https://www.charteroftrust.com/wp-content/uploads/2023/07/Charter-of-Trust_Principles_EN_2023-07-25.pdf
  10. Building incident response plans and collaborating with vendors to establish clear procedures to minimise and mitigate security risks and impact of any potential breaches
  11. Ensuring continuous improvement and cyber-resilience planning, including regular reassessment of security needs and staying informed (including C-level management) about emerging security trends

Besides the obvious cybersecurity and data protection regulatory incentives for certain industries and sectors to implement the security measures above and thus follow the two norms, the Geneva Dialogue experts have outlined the following:

  1. Security requirements set by stakeholders, partners, investors and donors, and, therefore, reputation and trust from customers, stakeholders, partners, investors or donors. Misuse of personal data or security breaches revealing poor security practices can hit not only with potential fines and legal consequences, but affect the organisation’s reputation.
  2. Intellectual property protection with the help of stricter cybersecurity measures.
  3. Potential third-party risks. Since such organisations are not directly involved in software development but do largely rely on ICTs, they operate with inherent risks stemming from third-party suppliers. This forces organisations to adopt stricter cybersecurity rules and, as a result, contribute to the implementation of the two norms.

Considering that such organisations may cover a wide range of entities – from schools and bakeries, to airports – the Geneva Dialogue experts have outlined a broad list of possible difficulties that may slow down organisations’ contribution to the implementation of these two norms:

  1. Budget constraints and limited expertise (or lack of such expertise at all) to particularly conduct regular security audits of external solutions, including services from cloud providers
  2. The unwillingness of infrastructure owners/operators to change legacy systems and infrastructure, which may lack built-in security features or may not be compatible with the latest security updates. In any case, such systems require expertise that organisations may lack, or more time than they can spare
  3. Lack or low awareness of business justification and rationale to implement required security measures (the same difficulty as for the manufacturers and/or suppliers of ICTs).
  4. Constantly evolving threat landscape that makes it challenging for organisations to keep up with the latest security measures and practices
  • The NIS2 Directive as an example of the legislation that establishes the cybersecurity risk management measures for entities in scope to protect network and information systems
  • UK NCSC Supply chain security guidance as a resource designed to assist organisations in managing supply chain risks and choosing trusted ICT suppliers
  • ATT&CK Matrix for Enterprise, MITRE ATT&CK® as a knowledge base of adversary tactics and techniques to support organisations in public and private sectors in conducting their threat assessments
  • The EU 5G Toolbox which addresses the risks related to non-technical (such as the risk of interference from non-EU state or state-backed actors through the 5G supply chain) and technical factors, and thus is designed to support organisations in public and private sectors
  • Software Bill of Materials (SBOM) and Hardware Bill of Materials (HBOM) as an example of the security document to request from ICT manufacturers/suppliers and use for evaluating the security and reliability of digital products
  • Singapore Cybersecurity Labelling Scheme (CLS) and the separate scheme for medical devices as an approach to enhance the security of consumer Internet of Things (IoT) devices and support consumers in making security-informed purchases
  • Singapore Common Criteria Scheme is established to support the info-communications industry with means to evaluate and certify their IT products against the CC standard in Singapore
  • Customers, especially large organisations, should demand SBOM/HBOM documentation from ICT manufacturers in order to ensure their security practices and, at the same time, incentivise the adoption of automated processes. Organisations in the public sector also need to step in and require SBOM/HBOM documentation for their security assessments (and, for instance, incorporate these requirements in their procurement policies)
  • For customers with limited resources, but yet a necessity to ensure the cybersecurity of their own processes and operations, ICT manufacturers and suppliers should provide, where possible, results of third-party security assessments 13Charter of Trust Principle #7 on Cyber-resilience through conformity and certification
    https://www.charteroftrust.com/wp-content/uploads/2023/07/Charter-of-Trust_Principles_EN_2023-07-25.pdf
    (e.g. security certifications based on known industry standards) to regularly prove the security of their solutions and help customers make informed decisions
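
As a minimal illustration of how a customer might consume the SBOM documentation mentioned above, the sketch below parses a CycloneDX-style JSON SBOM and lists its components. The field names follow the CycloneDX format, but the component data is invented:

```python
import json

# Invented CycloneDX-style SBOM for illustration only.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "logging-lib", "version": "2.14.0"},
    {"type": "library", "name": "http-client", "version": "4.5.13"}
  ]
}
"""

def list_components(sbom_text):
    """Return (name, version) pairs for every component declared in the SBOM."""
    sbom = json.loads(sbom_text)
    return [(c["name"], c["version"]) for c in sbom.get("components", [])]

for name, version in list_components(sbom_json):
    print(f"{name} {version}")
```

Even a listing this simple lets a customer cross-check supplied components against published advisories, which is the practical value of requiring SBOM/HBOM documentation in procurement.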

 

Role: Cybersecurity researchers

Do researchers – when discovering a vulnerability – always have to coordinate actions with vendors? Authorities? To whom would the reporting of vulnerabilities be considered ‘responsible’ under norm 13j?

Can (and should?) cybersecurity researchers independently mitigate the exploitation of the vulnerability without notifying the manufacturer? Or national authorities?

The role of a cybersecurity researcher refers to a professional who specialises in exploring and analysing various aspects of cybersecurity to identify vulnerabilities, threats, and potential risks in digital systems, software, and networks.

  • Technical community
  • The private sector (in those cases where researchers represent a company)
  • Academia

The Geneva Dialogue experts agreed that the primary role of a researcher is to find and disclose vulnerabilities; researchers are not expected to find a comprehensive solution to the entire security problem. Researchers are expected to follow certain ethical and security guidelines and, in particular, to always report discovered vulnerabilities to code owners and choose secure communication channels for doing so. Where needed, researchers should consider engaging appropriate authorities such as CERTs/CSIRTs to ensure coordination in vulnerability disclosure.

Researchers also play an important role in providing threat intelligence and assistance in investigation of supply chain threats. However, their reporting and research can be influenced by business incentives, profit-driven motives, as well as geopolitics, and thus lack independence and impartiality.

The discussion with the Geneva Dialogue experts made it possible to outline specifically the actions which cybersecurity researchers should avoid in order to contribute to the implementation of the two UN GGE norms:

  1. Publicly disclosing vulnerabilities without first notifying the affected vendors or relevant authorities. Responsible vulnerability disclosure would involve giving vendors a reasonable amount of time to address and patch the vulnerability before making them publicly known.
    What is an appropriate threshold for a vendor to respond, where a vulnerability has been discovered by a researcher? To this question, a group of experts agreed that it is first important to define the criticality of the discovered vulnerability (e.g. whether the vulnerability affects the national critical infrastructure). If it does, further considerations come into play, such as whether it is cross-jurisdictional or localised, and whether the researcher is in the same place as the vulnerability or not. The research community generally adheres to a maximum threshold of 90 days for a vendor to release a fix for a reported vulnerability if the vendor does not have an established timeline in its security policy.
  2. Exploiting or misusing discovered vulnerabilities. Researchers should refrain from exploiting or misusing the vulnerabilities they discover for personal gain, malicious intent, or any other unauthorised purpose. 14Paris Call Principle #5 on Non-proliferation: “Develop ways to prevent the proliferation of malicious software and practices intended to cause harm”
    https://pariscall.international/en/principles
    A group of experts stated that exploiting vulnerabilities for commercial gain should be prohibited.
  3. Engaging in unauthorised access. Researchers must not engage in unauthorised access or unauthorised activities while investigating vulnerabilities.
  4. Demanding payment or engaging in extortion tactics in exchange for information about vulnerabilities. The Geneva Dialogue experts agreed that such actions are unethical and can be illegal, depending on the jurisdiction.
  5. Publicly shaming vendors for their response to vulnerability disclosures or ignoring vendor disclosure policies. Researchers should focus on constructive engagement and collaboration to resolve the security issues. The Geneva Dialogue experts discussed possible actions for the researchers in situations where a code owner (i.e. manufacturer) does not respond, or responds too slowly. Some experts shared that researchers could find themselves in a difficult situation: either to wait for a response and further action from a manufacturer, or to act further without it, if risks are too high for the users. Maintaining good working relationships with vendors is important, however, as the experts agreed, the ultimate goal is to have the vulnerabilities fixed and enhance the security posture for all users. Therefore, researchers are expected to respect the vendor’s policies where possible, and should consider the context and specifics of the vulnerability (as well as any other factors) in order to minimise the risks.
  6. Engaging in dual-use research and disrupting or damaging systems which would result in negative impacts on the availability or functionality of the target systems. In this context, the Geneva Dialogue experts particularly discussed the complex question of legitimacy in vulnerability research, especially if critical systems or legacy platforms are involved. With respect to such life-critical systems, the experts highlighted that testing or finding vulnerabilities should be approached with extreme caution due to the high risks involved.
  7. Ignoring legal and regulatory considerations and, in particular, neglecting end-user safety and privacy. At the same time, it should be noted that legal requirements can be somewhat vague for researchers, or too complex to grasp, depending on a jurisdiction.
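
The 90-day convention discussed above can be expressed as a simple calculation. The dates and the optional vendor-policy override below are purely illustrative, and in practice the criticality of the vulnerability and the vendor's own policy take precedence over any fixed default:

```python
from datetime import date, timedelta

# Community default when the vendor's security policy sets no timeline.
DEFAULT_WINDOW_DAYS = 90

def disclosure_deadline(reported_on, vendor_window_days=None):
    """Earliest date on which public disclosure is generally considered
    acceptable if no fix has been released. A vendor-defined window,
    when one exists, overrides the 90-day default."""
    window = vendor_window_days if vendor_window_days is not None else DEFAULT_WINDOW_DAYS
    return reported_on + timedelta(days=window)

print(disclosure_deadline(date(2024, 1, 15)))       # default 90-day window
print(disclosure_deadline(date(2024, 1, 15), 120))  # vendor policy overrides
```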

There are several incentives driving researchers to practise responsible vulnerability disclosure and, specifically, to implement at least norm 13j:

  1. Clear reporting channels, clear processes and terms for coordination with vendors in vulnerability disclosure
  2. Initiatives and programmes offering legal protections and encouraging ethical vulnerability research and disclosure (e.g. safe harbours)
  3. Access to pre-release software or early access to security updates for vulnerability testing
  4. Publicity, recognition, and acknowledgement, including monetary rewards (e.g. bug bounty programs)
  5. Decriminalisation of ethical vulnerability research and disclosure, and exemption from prosecution for those who ensure proper authorisation and compliance with responsible disclosure guidelines
  6. Institutional support where governments, through laws and programs, require other organisations to promote and support responsible vulnerability disclosure (e.g. as an element in national cybersecurity strategy, or any other relevant national laws and rules)
  7. Research grants and funding by governments to support cybersecurity research initiatives
  8. Public–private collaboration between governments, including law enforcement agencies, private companies, and researchers

Researchers may face several demotivating factors and obstacles that prevent them from conducting responsible vulnerability disclosure:

  1. Legal barriers and a complex regulatory environment which can create uncertainties and deter researchers from engaging responsibly with other parties
  2. Fear of criminalisation as well as difficulty distinguishing harmless actions from malicious intent
  3. Lack of clarity in vendors’ policies and terms, including in legal protections
  4. Limited knowledge about vulnerability impact. Researchers may not always know if a vulnerability is present across multiple products or affects open-source libraries, making it challenging to assess the potential impact and determine the appropriate level of care for reporting
  5. Lack of a supportive legal and institutional environment to nurture conditions in which vulnerability research is welcomed and in demand across different organisations
  6. Complexity of supply chain disclosures: coordinating efforts and sharing vulnerabilities across supply chains can be challenging due to legal and logistical complexities. Such complexity may discourage researchers from engaging in cross-jurisdictional vulnerability disclosure

The Geneva Dialogue experts highlighted that the limitations of data and research provided by private actors should be acknowledged, as their interests may introduce biases or lack impartiality. To counter this, independent institutions or bodies, such as those from academia or civil society, can step in to mitigate these risks. By providing impartial monitoring and research, these independent organisations can contribute to creating a framework for reducing vulnerabilities and promoting stronger security in ICTs.

To address these demotivating factors and encourage responsible vulnerability disclosure, there is a need for legal reforms that decriminalise vulnerability reporting and provide clear protections for researchers. Establishing supportive legal frameworks that focus on encouraging responsible reporting without malicious intent can foster a more welcoming environment for researchers to come forward with their findings. Additionally, enhancing collaboration between researchers, vendors, and authorities can help address the challenges associated with cross-jurisdictional vulnerability disclosure and supply chain security.

 

Role: Civil society engaged in advocacy, research, and training

The role refers to a non-governmental organisation (NGO) or academia or policy institution, or to individuals who serve as intermediaries between users of digital products and decision-makers (including from both the private and public sector) to shape policies as well as influence public opinion on issues relevant to their mission. Such organisations can also engage in capacity building to help educate decision-makers, as well as users, about issues related to the security of digital products, safety for users, and other topics related to the implementation of the two UN GGE norms.

  • Civil society
  • Academia

The Geneva Dialogue experts highlighted the role of civil society and academia in advancing the implementation of the two UN GGE norms. In particular:

  1. Driving policy and institutional changes, e.g. by requiring greater transparency in vulnerability handling from companies and governments, or by driving cybersecurity labels for digital products. The Geneva Dialogue experts have particularly noted that civil society needs to be more involved in policy development to help manufacturers and other stakeholders better consider the role of organisational and citizen customers and users in the security process, and to help define criteria for trustworthy technology (cf. Paris Call Principle #5 on Non-proliferation: “Develop ways to prevent the proliferation of malicious software and practices intended to cause harm” – https://pariscall.international/en/principles). Civil society and academia can also help address trust issues related to ICT supply chain security (and the implementation of norm 13i) by providing a corresponding framework and tools to governments and ICT manufacturers/suppliers.
  2. Training and capacity building to bridge the gap between policymakers and technical experts (e.g. between the government and the private sector), helping decision-makers speak the same language and translate technical terms into national policies aligned with the UN cyber-stability framework. Academia can also support governments in building a harmonised interpretation of the norms and the framework across different jurisdictions.
  3. Creating pressure on decision-makers to prohibit the commercial exploitation of vulnerabilities.
  4. Facilitating international collaboration and information sharing between researchers, industries, and governments to minimise the risks stemming from the exploitation of ICT vulnerabilities.
  5. Measuring impact and effectiveness of initiatives, policies, and laws related to enhancing the security of digital products by conducting research to help refine and improve implementation strategies.
  6. Representing organisational and citizen customers and users to incentivise companies to improve the security in their products and processes.
  7. Teaming up with the private sector to help educate users about their possible role in ensuring the security of digital products.

Both academia and civil society organisations can be motivated to call for the implementation of the two UN GGE norms for the following reasons:

  1. Protecting users from security and safety risks, and safeguarding users’ data protection rights, through stronger security in digital products
  2. Advancing research and knowledge for academia
  3. Addressing cybersecurity challenges with a global impact

Several key challenges can prevent both civil society and academia from calling for stronger security in digital products, reducing ICT vulnerabilities, and thus implementing the two UN GGE norms:

  1. Lack of technical expertise and limited access to industry data and insights
    Even though both academia and civil society can greatly help in capacity building to bridge the gap between different stakeholders, they may themselves lack the necessary knowledge and expertise about the nuances of vulnerability disclosure and of securing ICT supply chains.
    Furthermore, both civil society organisations and academia may face challenges in accessing proprietary information and industry insights. Without access to comprehensive data on digital product security and supply chain risks, they may find it difficult to make informed, evidence-based calls that represent users’ interests.
  2. Influence of corporate interests
    Lobbying efforts by interested stakeholders to shape policy and regulation can limit the ability of civil society organisations and academia to advocate for stronger security in digital products.
Examples of such efforts include:

  • Swiss Digital Initiative as an example of an effort to bring ethical principles and values into technologies and urge organisations to ensure trustworthy digital services for end-users
  • Global Encryption Coalition as an example of an international effort initiated by the Center for Democracy and Technology, Global Partners Digital, Mozilla Corporation, Internet Society, and the Internet Freedom Foundation to promote and defend encryption in ICTs
  • Cyber Incident Tracer by CyberPeace Institute as the platform to bridge the information gap about cyberattacks on the healthcare sector and their impact on people
  • Geneva Declaration on Targeted Surveillance and Human Rights initiated by AccessNow, the Government of Catalonia, the private sector, and civil society organisations to implement an immediate moratorium on the export, sale, transfer, servicing, and use of targeted digital surveillance technologies until rigorous human rights safeguards are put in place

Civil society and academia often lack comprehensive industry data and insights (as mentioned above), including visibility into the supply chains of private companies and public sector organisations. Access to such information is usually restricted, yet it may be critical for building expertise across civil society and academic organisations so that they can conduct high-quality research and analysis.

At the same time, despite the lack of technical expertise, it is civil society and academia that can raise uncomfortable questions (e.g. about the human rights impacts of the exploitation of ICT vulnerabilities by both private and state actors) and build much-needed trust among different stakeholders for ICT supply chain security and responsible reporting of ICT vulnerabilities (and thus the implementation of both norms) by promoting international dialogue and a collaborative approach.

It is important to note that the Geneva Dialogue experts have recognised that each of the listed stakeholders has many sub-groups that might have additional specific roles and responsibilities. For instance, manufacturers include producers of software and hardware as well as operators of cloud or telecommunication infrastructure, while civil society includes advocacy groups, grassroots organisations, think tanks, and educational institutions. In addition, there may be a need to elaborate on the roles and responsibilities of additional stakeholder groups, such as the standardisation community. Such discussion may be part of the future work of the Geneva Dialogue, towards the next edition of the Geneva Manual.

Separately, the Geneva Dialogue experts have discussed expectations of states and regional organisations, highlighting their role in coordinating efforts with other states to ensure ICT supply chain security (given that ICT supply chains are global and cross-border), as well as in addressing security issues in digital products through effective legal frameworks and policies:

  • Codifying the norms and promoting responsible behaviour: norms should be translated into clear regulatory expectations, though this can be very challenging given the complex nature of ICT supply chains. Clear, interoperable security criteria for testing and security assessments are needed to address both technical and political concerns
  • However, even if such regulatory frameworks emerge, the challenge is to ensure the adoption of cybersecurity recommendations across organisations, especially across small and medium companies. While guidelines may be published to mitigate supply chain vulnerabilities and reduce risks, it remains unclear how to ensure that organisations actually follow these recommendations
  • In the context of OSS, government bodies could step in to coordinate efforts between manufacturers, the open-source community, and other relevant parties, share information, and leverage international collaborations to address cybersecurity threats and support their respective countries in times of crisis
  • The ability of national governments to communicate and collaborate with other states is also considered crucial for effectively addressing cybersecurity challenges
  • States are also expected to encourage responsible reporting of ICT vulnerabilities, recognise their exploitation as a threat, increase transparency about the stockpiling of ICT vulnerabilities (such as through vulnerability equities processes, VEP), and limit the commercial exploitation of ICT vulnerabilities.
    See OSCE Confidence-building measure (CBM) #16: “Participating States will, on a voluntary basis, encourage responsible reporting of vulnerabilities affecting the security of and in the use of ICTs and share associated information on available remedies to such vulnerabilities, including with relevant segments of the ICT business and industry, with the goal of increasing co-operation and transparency within the OSCE region. OSCE participating States agree that such information exchange, when occurring between States, should use appropriately authorized and protected communication channels, including the contact points designated in line with CBM 8 of Permanent Council Decision No. 1106, with a view to avoiding duplication.” – https://dig.watch/resource/confidence-building-measures-reduce-risks-conflict-stremming-ict
    See also GCSC Norm #3 to Avoid Tampering: “[…] the norm prohibits tampering with a product or service line, which puts the stability of cyberspace at risk. This norm would not prohibit targeted state action that poses little risk to the overall stability of cyberspace; for example, the targeted interception and tampering of a limited number of end-user devices in order to facilitate military espionage or criminal investigations.” – https://hcss.nl/gcsc-norms/