Most companies have an Incident Response Plan these days. With an increasing number of data breaches, having a plan in place is important so that all stakeholders (Security, Public Relations [PR], Engineering, etc.) are on the same page and have a course of action when a breach or incident occurs. It leads to better outcomes, faster response times, and coordinated messaging.
But what about a Vulnerability Disclosure Response Plan?
Most companies do not have a Vulnerability Disclosure Response Plan in place, yet mishandling a vulnerability disclosure can have PR, security, and ultimately business consequences similar to those of a security incident.
Zoom, a video conferencing company, handled a recent vulnerability disclosure in a way that damaged its brand and possibly its bottom line. A security researcher, Jonathan Leitschuh, discovered a vulnerability in Zoom where a victim's video camera could be turned on and shared when visiting an attacker's website. Leitschuh gave Zoom 90 days to address the vulnerability before publishing the details publicly, but Zoom failed to fix it in time. It took Zoom 10 days to confirm the vulnerability, when 3 days is more customary. In addition, Zoom took 72 days to schedule the first meeting to discuss potential fixes for the issue, which left little time to agree on a plan of action and ship a fix before the deadline.
Zoom's reactive response left many customers angry, frustrated, and threatening to stop using the video conferencing service. A coordinated vulnerability disclosure response plan could have helped Zoom avoid the negative PR and public backlash that followed.
All companies should have a plan in place, whether or not they run a bug bounty program. Some researchers do not want to participate in bug bounties for various reasons, which was the case with Leitschuh.
Vulnerability Disclosure Basics
Most security researchers look for vulnerabilities out of a genuine interest in technology, a desire to understand how things work, the chance to earn money from bug bounties, or a wish to make software safer. Understanding a researcher's motives helps you respond with empathy. Dismissing a security researcher is not just rude but insulting, given the time and energy they put into examining your products to help make them more secure.
Some security researchers are looking for handouts, such as t-shirts, certificates, or cash. This leads some researchers to submit low-priority vulnerabilities or false positives, which can frustrate security teams. It can be difficult to work with every researcher, but doing so is important to ensure better outcomes. Security teams also need to avoid a "boy who cried wolf" mentality, dismissing the next researcher simply because past reports turned out to be false positives. It is easy for overworked security teams to ignore vulnerability disclosures, but this must be avoided.
It is widely accepted that a security researcher will disclose a vulnerability publicly 90 days after reporting it to the company, whether or not it has been patched. There is some debate about whether this is a good thing, but it is the reality, and your company should operate and make decisions accordingly. Under no circumstances should you threaten to sue someone who discloses a vulnerability: it is counterproductive, does not scare most security researchers, and will not make you look good. Please note that this is not legal advice and we are not lawyers; please seek an attorney for legal advice.
Vulnerability Disclosure Plan
Having a Vulnerability Disclosure Response Plan in place helps ensure that your company is ready for vulnerability disclosures, helps you build a stronger relationship with the security community, and helps you avoid negative consequences.
Always treat the security researcher with respect, do not tell them they are wrong, and do not let non-security people make a determination about the vulnerability. Give the researcher the benefit of the doubt. Sometimes vulnerabilities can be complex and what is seen as a feature can be manipulated in an unintended way to abuse the application. Sometimes this can be nuanced and having a security person review the report before responding is important.
Your Vulnerability Disclosure Response Plan should have most of the elements outlined below in order to be effective. The most important thing is to engage with the security researcher early and often and try to understand the vulnerability from their point of view.
The first step is to have a Vulnerability Disclosure Policy in place that is externally available on your website and outlines how you handle vulnerabilities. That way you lay the groundwork for how you will engage with security researchers, what you expect from them, and what they should expect from you. Managing expectations is a big part of the process. You can view a sample Vulnerability Disclosure Policy at https://hackedu.io/disclosure.
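One lightweight way to make that policy discoverable is the security.txt convention: a small plain-text file served at /.well-known/security.txt that tells researchers where to report issues and where your policy lives. A minimal sketch, using placeholder example.com addresses you would swap for your own:

```
# Served at https://example.com/.well-known/security.txt (placeholder values)
Contact: mailto:security@example.com
Policy: https://example.com/disclosure
Preferred-Languages: en
Acknowledgments: https://example.com/security/thanks
```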
The second step is to create an internal Vulnerability Disclosure Response Plan, circulate it, and gain buy-in from relevant stakeholders. You may want to involve Security, PR, Legal, Engineering, Customer Support, and other relevant teams.
Below are the steps you should take once a vulnerability is disclosed to your organization; include them in your Vulnerability Disclosure Response Plan. You can modify some of the timelines, but they should stay close to what is suggested (a minimal scheduling sketch follows the list). The steps are:
- Confirm receipt of the report within 72 hours and communicate the course of action to the researcher.
- Within 14 days, have a security engineer review the report and determine the remediation plan. If it is judged to be a false positive or not an issue, have a second security engineer review it and confirm that determination.
- Within 30 days, update the security researcher on the remediation plan. If the plan is not to remediate within the 90 days because the report is a false positive or a low priority, explain the reasoning to the researcher. Allow the researcher to set up a call to discuss further if they disagree with the recommended path forward, and include all relevant stakeholders who may have an opinion or a different take on the vulnerability, such as Engineering, Customer Success, PR, Compliance, and Privacy.
- Release a fix and send it to the researcher for verification within 75 days, which leaves time for them to confirm the fix before the 90-day deadline. If the issue has not been fixed correctly, it is reasonable to ask for an additional two weeks to remediate it. If communication with the researcher has been consistent, most will agree.
- Once a fix has been deployed, disseminate an advisory to all customers and users so they can mitigate the risk (e.g., upgrade to the latest version).
- If the ultimate determination is not to fix the issue and the researcher still disagrees, develop a PR strategy. If the researcher discloses the vulnerability, your company should be ready with a thoughtful response, and PR should work closely with the security team to ensure the response includes the necessary technical details. If the vulnerability is released and customers become concerned, listen to them rather than reacting immediately, but still act quickly to address their concerns.
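These deadlines are easier to hold if they are turned into concrete calendar dates the moment a report arrives. A minimal sketch in Python, assuming the day counts suggested above and a hypothetical receipt date:

```python
from datetime import date, timedelta

# Day offsets from the date a report is received, mirroring the suggested timeline above.
MILESTONES = [
    ("Acknowledge the report and share the course of action", 3),   # within 72 hours
    ("Security engineer review and remediation plan", 14),
    ("Update the researcher on the remediation decision", 30),
    ("Fix released and sent to the researcher for verification", 75),
    ("Typical public disclosure deadline", 90),
]

def disclosure_schedule(received):
    """Return (milestone, due date) pairs for a report received on `received`."""
    return [(name, received + timedelta(days=days)) for name, days in MILESTONES]

# Hypothetical report received on 2019-07-08.
for name, due in disclosure_schedule(date(2019, 7, 8)):
    print(f"{due.isoformat()}  {name}")
```

Feeding each due date into whatever ticketing or calendar system your team already uses keeps every stakeholder aware of how much of the 90-day window remains.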
If you would like additional details on building a plan and the vulnerability disclosure process, there is a standard from ISO available: ISO/IEC 29147:2018 Information Technology - Security Techniques - Vulnerability Disclosure.
Conclusion
Chasing down false positives and working with external security researchers can sometimes feel like a waste of time. However, the time is worth it when legitimate vulnerabilities are disclosed and fixed, improving your product. Having a plan in place improves your process and speed for handling vulnerability disclosures, and it leads to better outcomes for the researcher, your security team, and your organization.