Why counting vulnerabilities is not a sufficient method of comparing product security

A lot of people find themselves in the position of trying to figure out which software package is the most secure, or at least the more secure of the available choices. They often try to do this by comparing vulnerability counts between the packages, pulling numbers from databases like MITRE's CVE list or NIST's NVD.
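That naive comparison can be sketched against the public NVD 2.0 REST API, which exposes a `keywordSearch` parameter and returns a `totalResults` count in its JSON response. The product keywords below are hypothetical placeholders; the sketch only builds the query URLs whose responses would feed such a count.

```python
# A sketch of the naive "count the CVEs" comparison, assuming the public
# NVD 2.0 REST API. The product keywords are placeholders; the point is
# how easy (and how misleading) this metric is to compute.
from urllib.parse import urlencode

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def nvd_count_url(keyword: str) -> str:
    """Build a query URL whose JSON response carries a `totalResults` field."""
    # resultsPerPage=1: only the total matters for this naive metric.
    return f"{NVD_API}?{urlencode({'keywordSearch': keyword, 'resultsPerPage': 1})}"

# Hypothetical packages being compared; fetching each URL and reading
# `totalResults` would yield the raw vulnerability counts people cite.
for product in ("product_a", "product_b"):
    print(nvd_count_url(product))
```

The trouble, as the rest of this post argues, is that the resulting numbers measure how many issues were reported and catalogued, not how many exist.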

However, consider this example timeline of vulnerability disclosure, taken from a sample issue posted to the Full Disclosure mailing list:

2013.05.11 Vulnerability reported to the vendor
2013.05.12 Vendor asked for details
2013.05.12 Details and exploit provided to the vendor
2013.05.30 Asked vendor about the status of investigation (no response)
2013.06.11 Sent another mail to the vendor (no response)
2013.06.15 Full disclosure

That is actually a real problem that could expose the passwords and private keys of a site, and it affects the core software. Even though the researcher followed responsible disclosure, the security team apparently did nothing about the issue for over a month. Perhaps the worst part of all this: there's a good chance this vulnerability will never make it into databases like CVE or NVD.

Consider this example the next time someone talks about numbers of vulnerabilities. Remind them that it's more important that there is an active group of people who are:

  1. Looking for vulnerabilities (these people don't have to be part of the product's insider community)
  2. Fixing them

In this case the product's active user base led a researcher to find the problem. But the product's developers didn't follow through on fixing it, and therein lies the real problem: users are left vulnerable and unaware.