Ninety days

The 90-day standard is a number that has a human being inside it. This is what the standard looks like from the researcher’s side of the timeline — the acknowledgements, the silences, the patches, and the occasional two-word reply that makes the whole thing worthwhile.


Google Project Zero published its 90-day disclosure policy in 2014 and the industry broadly adopted it over the following years. The logic is sound: 90 days gives a vendor time to understand, triage, and fix a reported vulnerability; it gives the researcher a clear commitment about when they can publish; and it gives users an expectation that things will get fixed in a reasonable timeframe rather than sitting quietly in someone's backlog for years.

The policy is a good one. I support it. I have also lived inside it, from the researcher's side, enough times to have views about what it actually feels like — which is rather different from what the clean, principled framing of the standard suggests.

Day one: the acknowledgement

You file the report. You receive an acknowledgement. The acknowledgement says something like: “Thank you for contacting us. We have received your report and will review it.” It does not say who will review it, when, or what review means in practice. It is automated, or it reads as automated. It is, in all likelihood, the last communication you will receive for some weeks.

This is fine. This is normal. Triage at scale is genuinely hard, and the acknowledgement is honest about what it is: a receipt, not an assessment. Managing your expectations at day one means understanding that you have successfully delivered a piece of mail, not that anyone has read it yet.

Day thirty: the first silence

Thirty days in, if you have heard nothing beyond the initial acknowledgement, you are in the first silence. This is where the discipline of the 90-day standard does its psychological work. Your instinct is to follow up. The follow-up rarely produces useful information; it just adds noise to the vendor's queue and signals impatience that, in my experience, does not accelerate anything.

Wait. Read the report again. Check that the reproducer still works on the current release. If the vendor has shipped an update in the intervening weeks, verify whether the issue is still present. Doing the work of keeping your report current is a better use of the thirty-day silence than a “just checking in” email.
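The discipline above amounts to keeping a small log of re-verification runs. A minimal sketch of what that record-keeping might look like, with entirely hypothetical version names and dates for illustration:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class ReportStatus:
    """Minimal log of re-verification runs during the silence.

    'still_present' records whether the reproducer still triggered the
    issue against the named release; all names here are illustrative.
    """
    filed: date
    checks: list = field(default_factory=list)  # (date, version, still_present)

    def verify(self, on: date, version: str, still_present: bool) -> None:
        # Record that the reproducer was re-run on this date/release.
        self.checks.append((on, version, still_present))

    def days_elapsed(self, today: date) -> int:
        # How deep into the window we are, counting from the filing date.
        return (today - self.filed).days


report = ReportStatus(filed=date(2024, 3, 1))
report.verify(date(2024, 3, 31), "v2.4.1", still_present=True)
print(report.days_elapsed(date(2024, 3, 31)))  # 30
```

The point is not the tooling; it is that a dated record of "still reproduces on the current release" is the useful artefact to have in hand at day ninety, not a thread of check-in emails.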

The taxonomy of vendor responses

Over several years and across multiple vendors, I have observed that responses fall into a small number of categories, each of which implies a different next step.

The rapid fix: the report is well-received, the fix is implemented quickly, a security advisory is published, you are credited. This is the ideal case. It happens more often than the prevailing narrative suggests, particularly with smaller projects and with vendors who have mature security programmes. OpenBSD, in my experience, is exemplary: the response to my kern_unveil and exec_elf reports was measured in days, and the “ok beck” and “ok guenther” sign-offs arrived with a clarity that I found genuinely refreshing. Two words. Unambiguous. Done.

The scheduled fix: the issue is acknowledged, assessed, and placed in a release queue. You receive a target date, or a version, or a quiet indication that it will be addressed. The communication is often minimal, but there is enough signal to know that something is happening. This requires patience but not anxiety.

The disputed assessment: the vendor does not agree with your severity assessment, or disputes whether the issue constitutes a vulnerability in their threat model. This is a legitimate disagreement and should be treated as such. Provide the evidence clearly, accept the vendor’s view if they can justify it, and publish your own assessment if you ultimately disagree — calmly, with the evidence, without theatre.

The closure without fix: the issue is closed without a patch, sometimes without explanation. This is the hardest case. The 90-day standard gives you the right to publish when the vendor has had reasonable time. Doing so constructively — accurately, without vindictiveness, giving appropriate credit to the programme overall — is the professional response.

The disclosure timeline speaks for itself. A researcher who files at day one and publishes at day ninety-one has acted professionally. No demands are needed. No drama. The number does the work.
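The arithmetic is simple enough to sketch. Assuming a strict window where day one is the filing date itself, the earliest publication date falls on day ninety-one:

```python
from datetime import date, timedelta

# The window popularised by Project Zero's policy.
DISCLOSURE_WINDOW_DAYS = 90


def earliest_publish(reported: date) -> date:
    """Earliest date a researcher can publish under a strict 90-day window.

    Day one is the report date, so publication lands on day ninety-one:
    the first day after the window closes.
    """
    return reported + timedelta(days=DISCLOSURE_WINDOW_DAYS)


# A report filed on 1 March 2024 may be published from 30 May 2024 onward.
print(earliest_publish(date(2024, 3, 1)))  # 2024-05-30
```

The value of writing it down this plainly is that there is nothing to negotiate: the date is fixed the moment the report is filed.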

What the standard is really for

The 90-day standard is sometimes framed as pressure on vendors — a deadline that forces action. I think this framing misses the more important thing it does, which is give researchers permission to publish without feeling as if they are betraying a relationship. The relationship with the vendor is important. The relationship with the users who need to know about the risk is also important. The standard mediates between those two interests by making the terms explicit in advance, before anyone has filed anything.

Both parties know the rules before the game starts. That is what makes it work when it works well.


The PING goes out on day one. Sometimes the echo comes back in days — crisp, clear, two words. Sometimes it takes the full ninety. Sometimes it does not come back in the form you expected. The discipline is in sending the next one with the same care as the first, regardless of what the last one returned.

Ninety days is a number. The practice inside it is what matters.