Data Integrity in Vulnerability Management

By Jess White, Deepwatch Vulnerability Management Engineer III

Estimated Reading Time: 4 minutes

A risk-based vulnerability management program is far more intertwined with other cybersecurity and IT programs than is commonly appreciated. The inputs to this program include a wide variety of data from across both cybersecurity and IT disciplines including:

  • Network architecture diagrams 
  • Firewall rules 
  • On-premises and cloud asset inventories
  • Regulatory or contractual obligations 
  • Freeze windows
  • Identification of “fragile” infrastructure that cannot be robustly scanned


Conversely, the results of vulnerability scans and assessments are then used to direct patch and configuration management teams, increase contextual knowledge about the threat landscape for SOC analysts, and paint a picture of cyber risk for executive leadership to make informed decisions.

Since these processes are so interdependent, it’s critically important to ensure that, to the extent practicable, this crucial data is inclusive, authoritative, and up to date.

“Continuous vulnerability scanning reduces security gaps and increases visibility to deliver rich threat data and remediation advice to minimize cyber risk to an organization.”

Tou Chang, Deepwatch Vulnerability Management Engineer

Why is this so important? Examples of issues that can arise from invalid vulnerability data include:

  • Stale vulnerability data: reports can contain data that is no longer relevant to the intended audience, which translates to lost time and resources. Additionally, organizations using products such as Tenable Lumin or Qualys VMDR with TruRisk could see this stale data factored into overall cyber risk calculations.
  • Incomplete asset inventories: receiving incomplete information for vulnerability scan and assessment targets creates a visibility risk. While vulnerability tools such as Tenable and Qualys commonly perform host discovery scanning, these tools are limited by the information supplied to them in their discovery scanning configurations.
  • Outdated reporting or tagging criteria: changes are made in the target environment without those updates being reflected in reports, which causes data to be reported to a potentially unintended audience, or not reported on at all.
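To illustrate the stale-data problem, here is a minimal sketch of a freshness filter that drops findings not observed within a reporting window. The field names (`host`, `last_seen`) and the 30-day window are illustrative assumptions, not a specific Tenable or Qualys schema.

```python
from datetime import datetime, timedelta

# Hypothetical findings as exported from a VM tool's API; field names are illustrative.
findings = [
    {"host": "web-01", "plugin": "OpenSSL < 3.0.8", "last_seen": "2024-05-01"},
    {"host": "db-02", "plugin": "SMBv1 Enabled", "last_seen": "2023-11-15"},
]

def fresh_findings(findings, as_of, max_age_days=30):
    """Keep only findings observed within the freshness window."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [
        f for f in findings
        if datetime.strptime(f["last_seen"], "%Y-%m-%d") >= cutoff
    ]

# Only web-01's finding falls inside the 30-day window as of 2024-05-20.
current = fresh_findings(findings, as_of=datetime(2024, 5, 20))
print(current)
```

A filter like this, run before report generation, keeps remediation teams from chasing findings that no longer exist.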

To make vulnerability management tool administration as hands-free as possible, I recommend the following:

  • Perform an in-depth review of your environment. Tenable and Qualys both offer health checks to their respective customers, which typically include recommendations for optimizing your platform and vulnerability management program. Additionally, your Deepwatch engineer can help perform an in-depth review of your vulnerability management tools and make recommendations on how to best move forward.
  • Get creative with dynamic tagging. Let’s face it: enterprise environments aren’t textbook. If common tags such as hostname or IP subnet don’t paint a complete picture of your organization’s divisions, tags based on operating system or installed software may make more sense. Additionally, vulnerability management tools have different strengths in their tagging engines: Qualys allows you to tag by detection, which enables unique tagging based on AD OUs, whereas Tenable allows you to tag based on other tags, which is very helpful for larger organizations requiring reporting for smaller business units.
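The dynamic-tagging idea above can be sketched as a set of rules mapping a tag name to a predicate over asset attributes. This is a toy model of how tagging engines evaluate rules, not Tenable's or Qualys's actual rule syntax; the asset fields (`os`, `software`) are illustrative assumptions.

```python
# Toy dynamic-tagging engine: each rule maps a tag to a predicate over
# asset attributes. Field names are illustrative, not a vendor schema.
RULES = {
    "windows-servers": lambda a: a["os"].startswith("Windows Server"),
    "runs-apache":     lambda a: "apache" in a.get("software", []),
}

def tag_asset(asset, rules=RULES):
    """Return the sorted list of tags whose rules match this asset."""
    return sorted(tag for tag, match in rules.items() if match(asset))

asset = {"hostname": "web-01", "os": "Windows Server 2022", "software": ["apache"]}
print(tag_asset(asset))  # → ['runs-apache', 'windows-servers']
```

Because tags are computed from current asset attributes rather than maintained by hand, they stay accurate as the environment changes, which is the whole point of dynamic tagging.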

“Accurately identifying or tagging assets directly correlates to the state of Asset Management programs.”

Kenny Williams, Deepwatch Vulnerability Management Engineer

  • Leverage VM tool APIs to extend reporting capabilities. Whether for log enrichment in a SIEM, data visualization in Tableau, creating issues for action in Jira, or building custom visualizations with Python or R, vulnerability management tool APIs can help meet unique organizational requirements.
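As a minimal sketch of this kind of API-driven reporting, the snippet below assumes findings have already been pulled from a VM tool’s export API and rolls them up two ways: a severity summary for leadership, and a CSV flattening suitable for import into Tableau or a SIEM. The field names are illustrative assumptions, not a real vendor schema.

```python
import csv
import io

# Findings as they might arrive from a VM tool's export API (illustrative fields).
findings = [
    {"host": "web-01", "severity": "critical", "title": "OpenSSL < 3.0.8"},
    {"host": "db-02", "severity": "high", "title": "SMBv1 Enabled"},
    {"host": "db-02", "severity": "low", "title": "TLS 1.0 Supported"},
]

def severity_summary(findings):
    """Roll findings up by severity for an executive-level view."""
    counts = {}
    for f in findings:
        counts[f["severity"]] = counts.get(f["severity"], 0) + 1
    return counts

def to_csv(findings):
    """Flatten findings to CSV for import into Tableau or a SIEM."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["host", "severity", "title"])
    writer.writeheader()
    writer.writerows(findings)
    return buf.getvalue()

print(severity_summary(findings))  # {'critical': 1, 'high': 1, 'low': 1}
```

The same transformation step could just as easily emit Jira issue payloads or SIEM enrichment records; the value is in shaping raw API output to each audience’s needs.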

Given the foundational nature of VM data, and how intertwined it is with other cybersecurity and IT programs and processes, it is worth taking the time and effort to ensure the data is right. An ounce of prevention is worth a pound of cure, and focusing on these steps will help lay the foundation for a dynamically updated, low-touch vulnerability management platform.

Jess White, Deepwatch Vulnerability Management Engineer III

Jess White has nearly a decade of IT experience supporting industries including healthcare, education, retail, insurance, logistics, and finance. She has spent five of those years specializing in IT risk management, including vulnerability management.
