Saving open source: The impending tragedy of the Cyber Resilience Act

TL;DR (if you think this article is too long, you can just read this summary)

Software, including open source, is becoming regulated the world over. This lengthy blog post explains the background to the Cyber Resilience Act in the European Union, what is good, its flaws and the likely negative impact on open source. And it also explains the arcane process by which it moves through the EU system, to help understand the timeline and how to make a change.

Translator’s Note: EU Cyber Resilience Act: https://www.european-cyber-resilience-act.com/

If you are looking for a more verbal introduction – Mike Milinkovich at Eclipse gave a very up-to-date and lucid presentation that covers the same ground. If you are more into short calls to action – then try GitHub, CNLL (in French), the Linux Foundation, or the more comprehensive response of the wider industry.

Translator's Note:

  • A very up-to-date and lucid presentation, “Update on the European Cyber Resilience Act”: https://www.youtube.com/watch?v=AmsM5_5QO5A

Calls to action:

  • GitHub: https://github.blog/2023-07-12-no-cyber-resilience-without-open-source-sustainability/

  • Linux Foundation: https://linuxfoundation.eu/cyber-resilience-act

  • Industry response: https://ccianet.org/library/joint-recommendations-for-a-feasible-cyber-resilience-act/

Although the IT industry is still small compared to other large industries and sectors, over the past decades it has become crucial to society. It is now common to see large events in the software and IT industry in the news. And, more often than not, it’s a story triggered by some sort of disaster: a misconfiguration, bug, or criminals and state actors that “got in” apparently too easily. Poor IT practices now also affect the major industries, from energy transport to manufacturing to finance to democratic processes and good government.

Because of this, societies and various governing bodies have certainly taken notice and, as a result, around the world, all sorts of software regulation and legislation are being prepared.

Using engineering history as an example, such regulation is a perfectly normal result. In the late 1800s, the mechanical industry saw incredible growth thanks, in part, to the invention of the steam engine. But as this industry grew, so did the number of accidents with exploding steam boilers, often flattening half of a given town.

After the explosion of the steamship Sultana in 1865, which saw 1,167 people killed, pressure was placed on the industry in the United States. This resulted in the creation of the American Boiler Manufacturers Association (ABMA) to start self-regulation of the industry. It took several hundred such explosions, and a particularly costly one in 1905 at a Boston shoe factory, before government policy intervened.

Interestingly, it was not ABMA that responded to the 1905 disaster, but a group of five engineers, members of the American Society of Mechanical Engineers, a professional organization of individuals rather than companies. These people wrote the first version of the Boiler Code, which was endorsed by the Massachusetts legislature shortly thereafter.

In many ways, these engineers, these individual volunteers “scratched an itch” to solve the problem; much like we do today in Open Source at the ASF as well as, for example, the Internet Engineering Task Force (which sets the standards for the Internet). It was the professional community which solved the problem: not their employers, the industry, or the ABMA.

There is currently a lot of legislation in process in almost all parts of the world, with the US and the European Union slightly ahead (and with plenty of coordination between the policy makers of the various countries).

In this blog post we’ll focus on just one for now: the Cyber Resilience Act (CRA) in the EU, as that is “first” from a timeline perspective.

It is by no means the most important piece of legislation. At the ASF we judge the EU’s Product Liability Directive (introducing “strict liability” to software), the US Executive Order 14028, “Improving the Nation’s Cybersecurity”, and the “Securing Open Source Software Act of 2023” (US) as perhaps having an even larger impact.

Translator's Note:

  • U.S. Executive Order 14028, “Improving the Nation’s Cybersecurity”:

    https://www.whitehouse.gov/briefing-room/presidential-actions/2021/05/12/executive-order-on-improving-the-nations-cybersecurity/

  • “Securing Open Source Software Act of 2023” (US):

    https://www.congress.gov/bill/118th-congress/senate-bill/917/text

This may be especially true as the US legislation could set standards for that nation through the National Institute of Standards and Technology (NIST), which is typically faster than standards developed by the EU (and thus may well set the global standard).

In day-to-day practice, software developers rarely need to consider regulation (unless you work in some specific field, say medical, aerospace, finance, or nuclear). Open Source licenses (covering our downstream outflow) and committer license agreements (covering our inflow) tend to have far-ranging disclaimers. And we often equate code to codified knowledge or speech.

However, in actual practice, things are not that simple. For example, here at the ASF we’ve had, over the years, the need to file some paperwork to let the Bureau of Industry and Security (BIS) in the United States know the exact location of cryptographic code that we make available for download [https://infra.apache.org/crypto.html]. And code distributed by the ASF cannot be exported (or re-exported) to certain destinations or to persons on certain lists.

Translator's Note: certain destinations or persons on certain lists; export rules for ASF products: https://www.apache.org/licenses/exports/

In the EU the Cyber Resilience Act (CRA) is now making its way through the law-making processes (and due for a key vote on July 19, 2023). This act will apply to a wide swath of software (and hardware with embedded software) in the EU. The intent of this regulation is good (and arguably long overdue): to make software much more secure.

Translator’s Note: The EU Cyber Resilience Act (https://www.european-cyber-resilience-act.com/) has been voted on, but has triggered many objections; subsequent developments are worth watching.

The act attempts to do this in a number of ways. The most important is that the CRA will require the market to apply industry good practices to security when designing, building, releasing, and maintaining software. At a most basic level, the CRA formalizes what is by and large already policy at the ASF: manage your bugs and accept, triage, and fix security vulnerabilities. This is also done by pairing this with good governance or practices; such as registering CVEs (Common Vulnerabilities and Exposures) when appropriate, doing release notes, and decent versioning (and in fairness, some of those we should further formalize and improve).

Translator’s Note: ASF’s current basic policy

 https://www.apache.org/security/committers.html

The CRA will also attempt to ensure that any and all software in the European market meets some sort of minimum level of security by fairly simple self-certification documented in a CE conformity declaration. Or, for software that is more critical, such as a firewall or a secure cryptographic key enclave, an actual “real” certification and audit by an external, regulated, and notified body. The CRA will also define a number of processes to monitor compliance in the market.

Translator's note:

  • The “CE” mark is a product safety conformity mark (that is, it covers essential safety requirements, i.e. that products do not endanger the safety of humans, animals, or goods, rather than general quality requirements). It is regarded as a passport for manufacturers to enter the European market. CE stands for CONFORMITÉ EUROPÉENNE.

  • Secure cryptographic key enclave: the hardware-protected portion of the processor and memory.

EU policymakers recognize that these “industry best practices” are not yet well defined (within the industry in general, the ASF is the exception, not the rule) — and a lot of the CRA relies on international standards organizations to create the standards one can use to audit one’s project (self-certification) or that can be used by external auditors.

There is also an expectation that significant vulnerabilities will get special treatment – and that these will get reported early. More on that later.

If you’ve followed the various blogs and letters, there has been a lot of focus by open source foundations to help refine the current wording of the CRA to make open source software “exempt”; i.e., have the CRA apply only when the code leaves the open source commons, and then continue to apply throughout the entire commercial supply chain. And also to stop the CRA from applying when something, e.g. a security fix, comes back and enters the commons again.

Translator's note:

  • Blog: https://blog.opensource.org/what-is-the-cyber-resilience-act-and-why-its-important-for-open-source/

  • Open letter: https://blog.opensource.org/the-ultimate-list-of-reactions-to-the-cyber-resilience-act/

By and large, these efforts have not been successful. Successive versions of the documents changed considerably – but not around this specific policy issue.

To understand why, representatives of the ASF (together with OpenSSL), spoke directly to the EU on July 7, the first time we actually were able to interact with lawmakers in a meaningful way.

From this conversation, we learned that the policy makers are very aware that open source is crucial to the IT industry — both for “production” and innovation. And, because of this, they want to avoid killing the goose that lays the golden eggs.

Translator’s note: Policymakers are very aware

https://digital-strategy.ec.europa.eu/en/library/cyber-resilience-act-impact-assessment

On the other hand, EU lawmakers also realize that open source is often 95% or more of the software stack on which a typical European Small and Medium Enterprise (SME) operates or is licensed. And it is that entire stack which the SME, as the party that places it on the market, is liable for.

From what we understand, policymakers assume that these process improvements (and (self) certification) are costly; on the order of 25% more in cost overhead. This is based on recently introduced similar regulations in the medical sector and the CRA impact assessment (any EU law proposed needs to have its likely impact in economic terms documented).

So looking at the whole stack of an SME (i.e., 95% open source, 5% secret sauce), then for most European SMEs this extra effort over the full 100% would be several times their engineering effort and hence would not be feasible. Whereas, the thinking at the EU goes, certifying the 5 or 10% of the code they build on top of the open source stack is a lot more achievable.

So, for this reason, the policymakers (1) have made it crystal clear to the ASF that they intend to have the CRA apply to open-source foundations. The current exceptions for open source are either for pure hobbyists, code that is not used in real life, or for things such as mirrors and package repositories like NPM or Maven Central. The way they do this is a presumption of commercial intent if the software is used anywhere in a commercial environment.

A piece of EU legislation is generally drafted by the European Commission (who also prepares things such as impact studies). It is then discussed in Parliament. This is generally done in smaller committees. These committees prepare reports and ultimately legislation then goes to a plenary session of the parliament for voting(2).

For the CRA the main committees are LIBE, IMCO, and ITRE.

The first, LIBE (Committee on Civil Liberties, Justice and Home Affairs) — where things such as ‘free speech’ are discussed — declined to produce a report. Next IMCO, the Committee on the Internal Market and Consumer Protection, looked at what is important for the consumers and the internal market. It produced a report that was fed into ITRE.

ITRE, the Committee on Industry, Research and Energy, has since produced a consensus document that is expected to be discussed publicly in the week of July 17, 2023, and to get its final committee endorsement (they generally do not vote on things when there is consent).

Translator's Note: public discussion in the week of July 17, 2023:

https://www.europarl.europa.eu/committees/en/itre/home/highlights

Once this completes, the proposal goes through the European Parliament for voting. Depending on how controversial or consensual it is at that time, there may, or may not, be discussion and a free vote.

Translator's note: progress of the legislation: https://www.europarl.europa.eu/legislative-train/theme-a-europe-fit-for-the-digital-age/file-european-cyber-resilience-act

In the meantime – the third party of the EU – the Council – also prepares its version of the Act. These are essentially the relevant ministers of each country, who look at it from a national perspective. The three versions (EC, Parliament, and Council) are then discussed, behind closed doors, in the Trialogues – which then yields the final version that becomes law.

Right now all parties in the lawmaking process are said to have reached a rough consensus – and two of them shared with the ASF their opinion that there is no controversy. Also, copies of the various consensus documents have leaked – so we know that they are not far apart, and we can now also start to analyze them.

The current definitions(3) are such that the CRA applies to the ASF, all of its (volunteer) developers, and all our output. And, as the ASF understands from its meeting with policy makers, this was intentional.

There are quite a few concerns with the CRA; but the following are probably the top ones for the ASF community.

No concept of a commons distinct from the commercial market; it is an all-in approach: The first issue is that the CRA takes a binary all-or-nothing approach. You are either in or you are out. And when you are in – what is applied to you is, essentially, what needs to be applied to a full-blown commercial product that is sold to consumers.

While open source can be close to that (e.g., Apache NetBeans or Apache Zeppelin, albeit not sold), open source generally is not part of that commercial setting. Instead, it may be managed as a piece of shared knowledge or a ‘commons’. Much like, for example, academic papers or reference blueprints. The CRA does not acknowledge this, and hence applies itself in full (as opposed to, for example, just applying the elements of the CRA that could make sense in that context, such as good vulnerability handling, versioning, and SBOMs).
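Obligations like vulnerability handling, versioning, and SBOMs are the parts of the CRA that could map naturally onto open source practice. As a rough illustration (this is one common SBOM format, not something the CRA prescribes verbatim, and the component listed is just a hypothetical example), a minimal software bill of materials in CycloneDX JSON could look like this:

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "version": 1,
  "metadata": {
    "component": { "type": "application", "name": "example-app", "version": "1.0.0" }
  },
  "components": [
    {
      "type": "library",
      "group": "org.apache.commons",
      "name": "commons-lang3",
      "version": "3.13.0",
      "purl": "pkg:maven/org.apache.commons/commons-lang3@3.13.0"
    }
  ]
}
```

Tooling already exists to generate such documents automatically from a build (CycloneDX and SPDX both have plugins for common build systems), which makes the SBOM obligation one of the more mechanical ones to meet.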

The CRA would regulate open source projects unless they have “a fully decentralized development model.” However, a project where a “corporate” employee has commit rights would not be exempt (regardless, potentially, of the upstream collaboration having little or nothing to do with their employer’s commercial product). And some projects, like the venerable OpenSSL project, have an even more complex model.

Translator's Note: the OpenSSL project's model is even more complex:

https://www.openssl.org/blog/blog/2023/07/17/who-writes-openssl/

This turns the win-wins of open source on its head. If corporate maintainers are banned, corporations may pull back from allowing their employees to maintain projects, harming the open source innovation ecosystem and, ironically, undermining its resilience and its significant economic/growth generator (€9bn per year according to the EU impact assessment).

It also makes it very hard to see who in the ASF community would do the extra (self) certification work that the ASF would need to do.

The net effect of this is actually quite broad. To give an example from the “Recitals(4), 10a” (and there are many such examples):

Similarly, where the main contributors to free and open-source projects are developers employed by commercial entities and when such developers or the employer can exercise control as to which modifications are accepted in the code base, the project should generally be considered to be of a commercial nature.

Here the lack of a transactional connection between those contributors and the commercial employers is problematic. For example, the developer could be an airline pilot employed by a commercial airline (i.e. a commercial entity) – who contributes to open source in their spare time: this part of the policy would make that contribution ‘commercial’. Also, at the ASF, the main contributors (committers) are of course able to exercise a level of control over what goes into a codebase(5).

Translator's note: This means that companies that have nothing to do with open source, such as airlines, would also be caught up and brought within the scope of regulation.

And what makes matters worse is that the type of open source organizations most affected are also exactly those that, today, tend to have very mature security processes, with vulnerabilities getting triaged, fixed, and disclosed responsibly with CVEs to match. While it generally is further downstream; with the companies that place the product on the market — that the CRA needs to drive significant improvement. It now risks doing the reverse.

The CRA affects projects that are entirely volunteer-led and -driven (such as those at the ASF), where no one company has any influence on what the product does and when it releases. And any project where an employee of a commercial entity has commit rights is affected.

This leads to a problem: both commercial companies and open-source projects will need to be much more careful about which committers can work on code, what funding they take, and which patches they accept.

In the certification there is a strong assumption that (self-)certification of modules is "transitive"; i.e. that if you build something from certified modules, you only have to certify the few extra things you have done. Unfortunately, this is not true in general; certification is very much about showing how you, as the final, liable organization, have made sure that what you deliver is fit for the purpose for which you deliver it, in the specific setting at your customer. That information is not available "upstream" at the open source organizations that self-certify the building blocks.

The core of certification is to ascertain that what you release is suitably secure for its intended purpose. Specifically, that you have done security by design and mapped out your threat actors, vectors, and risks, and then made reasonable engineering compromises based on that risk.

Unfortunately, in open source we often have no idea how our software is going to be used. And, as we have learned (the hard way) over the past decade, it is key to the good governance of our shared commons that we do not discriminate or otherwise limit how our licenses may be used; in fact, that non-discrimination is part of the Open Source Definition.

Translator's note: the Open Source Definition is a set of ten basic principles for open source licenses, maintained by the OSI; a license that violates these principles may not call itself an "open source" license. See https://zh.wikipedia.org/zh-cn/%E5%BC%80%E6%BA%90%E5%AE%9A%E4%B9%89

Some of the obligations are virtually impossible to meet: for example, there is an obligation to "deliver a product without known exploitable vulnerabilities". This is an almost impossible bar to set, especially as open-source authors neither know, nor have control over, how their code is integrated downstream.
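To see why "known" is such a slippery word, consider the mechanics of even the most naive check: a dependency list matched against a snapshot of an advisory feed. The sketch below is purely illustrative; the advisory entries and package names are hypothetical, and a real check would query a live database (such as a CVE or OSV feed) whose contents change daily.

```python
# Illustrative only: a naive "known exploitable vulnerabilities" check.
# The advisory snapshot and package names below are hypothetical.

# Snapshot of advisories: package name -> versions known to be exploitable
# *at the moment the snapshot was taken*.
ADVISORIES = {
    "examplelib": {"1.0.0", "1.0.1"},  # hypothetical entries
    "otherlib": {"2.3.0"},
}

def known_exploitable(dependencies):
    """Return the (name, version) pairs flagged by the advisory snapshot."""
    return [
        (name, version)
        for name, version in dependencies
        if version in ADVISORIES.get(name, set())
    ]

deps = [("examplelib", "1.0.1"), ("otherlib", "2.4.0")]
print(known_exploitable(deps))  # flags only the examplelib entry
```

The point is that the result is only as good as the snapshot: a product that had no known exploitable vulnerabilities at release time can fail that test the next morning, through no action of the upstream author.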

The next problem is around standards. The CRA refers to a large number of "to be written" international standards (generally assumed to be created at CEN-CENELEC). The IT industry in general, and open source in particular, does not have a great track record of working with these standards bodies, in part because almost all key internet standards (including those relied on at the ASF) are maintained at the IETF and W3C. In fact, it is not uncommon for the bylaws of these standards organizations not to allow open source organizations to become members in any meaningful way.

Translator's note: on standards bodies being inaccessible to open source organizations, see https://blog.opensource.org/another-issue-with-the-cyber-resilience-act-european-standards-bodies-are-inaccessible-to-open-source-projects/

The CRA requires the disclosure of serious unpatched and exploited vulnerabilities to ENISA (an EU institution) within a timeline measured in hours, before they are fixed. This runs counter to industry best practice: responsible disclosure of the fix and a workaround.

And not only does this too-early reporting distract from getting a fix out; for international communities it is easy to run afoul of other countries insisting on the same information or, worse, prohibiting such sharing. This breaks the very core of the fair and equitable reporting culture that open source relies on.

And, as this information is only useful to ENISA when it is then widely shared, it is rational for organizations to choose the prudent, globally "fair" option and take the easy path out: ensure you never hear about such issues in the first place. Or do the opposite, and simply make things public right before the (first) reporting deadline rolls over, i.e. before they are fixed.

Translator's note: that is, either nothing is announced to the public until the problem is fixed, or the problem is announced to the whole world the moment it is found, rather than reported early to only certain specific agencies such as the EU's ENISA.

So this is yet another example where, with all its good intentions, the CRA may end up accomplishing the exact opposite.


Looking at the IT industry in Europe now, one can observe that it is generally not open source (especially not the kind coming from the likes of the ASF) that is the root cause of the sorry state of security in the IT industry. Quite the contrary.

In contrast, most SMEs in Europe rarely update their dependencies and are generally not well versed in dealing with security issue reports. And if (regular) updates at the ASF create even more (re)certification work for them, they may become even slower to pick up our updates and security fixes.

However, there is also a lot in the CRA that is feasible, and that we know is likely to be effective, including at the level of open source organizations such as the ASF.

In fact, we do most of this already today: good triage of vulnerability reports, responsible disclosure, registering CVEs, and being careful with version numbers. On top of that we apply good governance, with board reporting by the projects and the occasional project being moved to the Attic when its time has come.
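"Being careful with version numbers" matters because downstream tooling often decides, from the version string alone, whether a given release carries the fix for an advisory. A minimal sketch of such an affected-range check, using simple dotted-numeric versions and a hypothetical advisory (the `parse` and `is_affected` helpers and the version numbers are illustrative, not any particular project's tooling):

```python
# Illustrative sketch: is a version inside an advisory's affected range?
# Assumes plain dotted-numeric versions; real schemes (pre-releases,
# epochs, etc.) need a proper version library.

def parse(version):
    """Turn a string like '2.4.51' into a comparable tuple (2, 4, 51)."""
    return tuple(int(part) for part in version.split("."))

def is_affected(version, introduced, fixed):
    """True if `version` lies in the half-open range [introduced, fixed)."""
    return parse(introduced) <= parse(version) < parse(fixed)

# Hypothetical advisory: flaw introduced in 2.4.0, fixed in 2.4.51.
print(is_affected("2.4.50", "2.4.0", "2.4.51"))  # True: still vulnerable
print(is_affected("2.4.51", "2.4.0", "2.4.51"))  # False: carries the fix
```

If a project reuses or reorders version numbers carelessly, this kind of downstream comparison silently gives the wrong answer, which is precisely why disciplined versioning belongs on the list of security practices.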

The problem is more that the CRA also piles on a whole range of requirements that either threaten the very fragile "win-win" of open source contributions and our commons, go against industry good practice, or are downright impossible; i.e. it tries to treat the open source commons as identical to the commercial sector.

In fact, the USA appears to realize this, and is taking the path of working with the industry, through NIST (the National Institute of Standards and Technology), to document these existing good practices.

And to some extent, the US appears to be closer to the historical engineer- and individual-led ASME process that produced the Boiler Code, while the EU seems to be more on the path of asking manufacturers rather than experts.


There is, of course, an elephant in the room: the well-oiled mechanism by which "the internet treats censorship as a malfunction and routes around it" (John Perry Barlow).

Translator's note: "an elephant in the room" is an obvious problem that is too big or awkward for anyone to address.

We saw that mechanism come into action in the 1990s, when the USA tried to regulate cryptographic software and only "export strength" cryptography could leave the US. That led to a lot of the cryptographic industry and its staff leaving the US, physically and legally, and a move of that industry from the USA to Europe. From there, companies would simply import their code back into the USA, or ship it from Europe to the rest of the world, unencumbered by the rules of the US Bureau of Export Administration (BXA). It took over two decades for this to normalize (and we still have the vestiges of that at the ASF).

| Translator's note: remnants of this can still be seen at the ASF: https://www.apache.org/licenses/exports/

So, as the ASF, we also need to factor in the risk that our communities may split over the CRA; especially if our European communities are not able to muster enough capacity and capability to implement the CRA at the ASF.


The week of July 17, 2023 will see the vote in ITRE, the parliamentary Committee on Industry, Research and Energy that recommends to the Members of the European Parliament how to vote. Once that is done, the Trialogues (between the European Parliament, the Council of the European Union, and the European Commission) will likely start after the summer 2023 recess. If the consensus between the three power holders holds (as it appears to for now), the process may conclude as early as December.

So, in the very short term, one can reach out to the MEPs on ITRE. It generally helps if these messages are polite, sent by a party with some political or economic standing (e.g. the CEO of an SME organization), and tuned to your local setting: addressed to a parliamentarian of your own country, in your own language, and mindful of the political position of the party they represent. As the regulation of open source is intentional, and there is also a lot of common-sense, good (open source) practice in the CRA, the expectation is that we are past the point where asking for a blanket exception is productive.

Translator's note: the members of the ITRE (Industry, Research and Energy) committee are listed at

https://www.europarl.europa.eu/committees/en/itre/home/members

At the ASF, we expect to focus on the Council version (as its text generally "wins" in the Trialogues and is, right now, a bit better than the ITRE consensus text). For this we can use your help: in particular, if you can help us get the executives of larger SMEs in your country engaged and willing to explain the impact at a national level, please contact the ASF's VP of Public Affairs at dirkx@apache.org.

Translator's note: the appeal in the last few paragraphs may seem cryptic, but it simply means "open source has not yet succeeded; comrades must keep working hard"!

Reprinted from | Open Source Rainforest

Editor | Wang Jun

Related Reading

Bobo's impressions of organizing and attending CommunityOverCode Asia 2023
COSCon'23 open source marketplace: an open source party on the lawn

Introduction to Kaiyuan Society

Kaiyuan Society (KAIYUANSHE) was founded in 2014 and is composed of individual members who volunteer to contribute to the open source cause. Formed on the principles of "contribution, consensus, and co-governance", it has always remained vendor-neutral, public-interest, and non-profit, and was the first open source community federation to take "open source governance, global bridging, community development, and project incubation" as its mission. Kaiyuan Society actively cooperates with communities, enterprises, and government-related bodies that support open source. With the vision of "rooted in China, contributing to the world", it aims to build a healthy and sustainable open source ecosystem and to help the Chinese open source community become an active participant in, and contributor to, the global open source system.

In 2017, Kaiyuan Society transformed to be composed entirely of individual members, operating according to the governance model of top international open source foundations such as the ASF. Over the past nine years it has connected tens of thousands of open source people, gathered thousands of community members and volunteers and hundreds of speakers at home and abroad, and cooperated with hundreds of sponsors, media outlets, and community partners.



Origin: blog.csdn.net/kaiyuanshe/article/details/132658305