Alex Dow Aug 31, 2021 4 min read

The "Official" Definition of Critical Software in 5

We could easily say that 2021 is the year the world realized the importance of good software when it comes to cyber security. In the last 12 months we have seen a large portion of the United States experience energy shortages, we have seen 1,500 businesses ransomed because their supplier's supplier had a vulnerability in its software, and let's not forget the supply chain breach that affected 18,000 companies, including every corner of the US government.

Needless to say, the US government has become frustrated with the status quo of insecure software in its supply chain, and on May 12th, 2021 released an Executive Order to “improve the United States’ posture on cyber security”. Within the Executive Order, the National Institute of Standards and Technology (NIST) was mandated to, among other things, define “critical software” in an effort to help buyers (the government) and suppliers (the industry) better understand where to focus efforts on securing software.

This type of initiative will increase visibility of the problem and initiate change critical to our continued adoption of, and reliance on, information systems as a society. However, it will be a journey to bring secure coding standards and practices to decades of existing code and the existing developer culture. And while this definition is for US government use for now, most commercial organizations will, one way or another, adopt these definitions and the secure coding standards and practices NIST eventually produces.

However, as I was reading through the current deliverables from NIST, in particular the definitions of “critical software”, I got uneasy. Let’s dive in.

What is NIST’s Definition of “Critical Software”?

NIST has outlined that critical software will have one or more of the following attributes:

  • Software that is designed to run with elevated privilege or manage privileges

  • Software that has direct or privileged access to network or computer resources

  • Software that is designed to control access to data or operational technologies

  • Software that performs a function critical to trust

  • Software that operates outside of normal trust boundaries with privileged access
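To make the breadth of these attributes concrete, here is a minimal sketch (my own illustration, not an official NIST tool) that tags a few hypothetical everyday applications with the attributes they plausibly match. The application names and their attribute tags are invented for illustration; the point is that under the "one or more" rule, each one qualifies:

```python
# Shorthand keys for the five NIST attributes listed above.
ATTRIBUTES = {
    "elevated_privilege",  # runs with elevated privilege or manages privileges
    "privileged_access",   # direct/privileged access to network or compute
    "controls_access",     # controls access to data or operational technologies
    "trust_critical",      # performs a function critical to trust
    "outside_boundary",    # operates outside normal trust boundaries
}

# Hypothetical everyday software, tagged with plausibly matching attributes.
examples = {
    "os_kernel":        {"elevated_privilege", "privileged_access"},
    "web_browser":      {"privileged_access", "outside_boundary"},
    "password_manager": {"controls_access", "trust_critical"},
    "backup_agent":     {"elevated_privilege", "controls_access"},
    "chat_client":      {"privileged_access"},
}

def is_critical(attrs):
    """Per the definition: critical if it has one or more of the attributes."""
    return bool(attrs & ATTRIBUTES)

for name, attrs in examples.items():
    print(name, "->", "critical" if is_critical(attrs) else "not critical")
```

Run it and every example comes back "critical", which is exactly the problem discussed next.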

So All Software?! This is Awkward.

Perhaps you can now understand why I got a little uneasy; the above attributes don’t describe a subset of software that is deemed critical and thus needs to be coded better than most. Rather, they describe all software as critical. While I am all for raising the water level of application security, labeling all software as critical is detrimental to actually moving the needle and improving security within development teams, because no company has an unlimited budget to secure everything and must prioritize what to protect first.

When Everything is Critical, Nothing is Critical

I have worked with several companies over the years, and a commonality among companies that struggle to mature their cyber security posture is a lack of risk management fundamentals. In the context of this subject, companies that don’t know which systems and data are critical to business operations, or, worse, companies that believe all systems and data are critical, remain in a perpetual, low-maturity, reactive state.

So is this NIST Definition and Future Standard Useless?

It is important to put the published definitions and eventual standard into context, particularly who the intended audience is: the US government and its suppliers. These audiences are under constant attack by highly sophisticated threat actors, and the vast majority of the systems and data they own are incredibly sensitive, in many cases a matter of national security.

I believe the existence of the definitions and eventual standards are a great move in the right direction, but the broadness of the definitions in their current form will cause confusion on what to remediate and when. I hope when the standard comes out there will be guidance on prioritizing based on criticality and not just category.
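One way such prioritization guidance could work, sketched below with weights and assets I invented purely for illustration: score each system by which attributes it matches and remediate the highest scores first, instead of treating "critical" as a binary label.

```python
# Hypothetical weights per attribute; real guidance would define these.
WEIGHTS = {
    "elevated_privilege": 3,
    "trust_critical": 3,
    "privileged_access": 2,
    "controls_access": 2,
    "outside_boundary": 1,
}

# Invented example assets, tagged with the attributes they match.
assets = {
    "identity_provider": ["elevated_privilege", "trust_critical"],
    "internal_wiki":     ["controls_access"],
    "vpn_gateway":       ["privileged_access", "outside_boundary"],
}

def score(attrs):
    """Sum the weights of the matched attributes into a criticality score."""
    return sum(WEIGHTS[a] for a in attrs)

# Remediation order: highest criticality score first.
ranked = sorted(assets, key=lambda name: score(assets[name]), reverse=True)
print(ranked)
```

Under these invented weights, the identity provider outranks the VPN gateway, which outranks the wiki; a binary critical/not-critical label would have flagged all three equally.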

So What Are We to Do?

Over the past decade or so we have seen a steady increase in businesses developing software in-house rather than buying it off the shelf. This has meant more and more business-critical systems and data moving out from behind the walled garden of firewalls and VPNs and onto the Internet, protected by nothing more than a web application. Hackers have taken note and are adjusting their attack patterns to target web apps and DevOps tool chains, and the data in annual reports such as Verizon’s DBIR confirms these trends.

The tide for application security is rising and, as they say, a rising tide lifts all boats. While the NIST definition and eventual standard will come into force with a focus on US government systems and supply chains for the immediate future, it is quite likely that the private sector will adopt some or all of these standards as well in an effort to reduce the risk of software development.

For now, the definition and draft standards are great bellwethers for the future of secure software development, and understanding the threats these standards are trying to address is a great first step in increasing awareness. While applying such high-assurance standards may not be required across the board, understanding the standards and knowing when and how to apply them will be critical to developers’ career growth in the future.