[Secure-testing-team] Proposal: new tags
Florian Weimer
fw at deneb.enyo.de
Fri Sep 16 18:39:13 UTC 2005
* Joey Hess:
> Florian Weimer wrote:
>> REJECTED, RESERVED, NOT-FOR-US replace the corresponding "NOTE:"
>> variants. Parsing the old tags is rather fragile because NOTE: is
>> essentially a free-form field, so we often miss spelling errors. (The
>> old tags remain valid, though -- there is no need to replace them at
>> this point.)
>
> Good idea on rejected and reserved.
Okay, I'll implement something like this.
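To illustrate what I have in mind (the CVE numbers below are made-up
placeholders, not real entries), entries with the new tags would simply
read:

CVE-2005-XXXX
RESERVED
CVE-2005-YYYY
REJECTED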
> Not sure about not-for-us; part of the reason we put the name of the
> software in parens is to aid in finding bugs in the software if it does
> end up entering Debian later on.
Let's use "NOT-FOR-US: reason" then. I want to get rid of "NOTE:
not-for-us" because it's quite hard to detect misspelled variants. It
doesn't matter much at the moment because the field isn't parsed
mechanically.
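If we do start checking it mechanically, the check can be trivial.
Here is a rough sketch (Python, not anything I have committed; the tag
set and the file layout it assumes are made up for the example):

#!/usr/bin/env python3
# Rough sketch, not the real tracker code: warn about any line whose
# leading tag is not one of the fixed tags, so typos such as "RESEVED"
# or "NOT-FOR_US" show up immediately.  The tag set and the file layout
# assumed here are made up for the example.
import sys

KNOWN_TAGS = {"RESERVED", "REJECTED", "NOT-FOR-US", "INVALID", "TODO", "NOTE"}

def check(path):
    with open(path) as f:
        for lineno, line in enumerate(f, 1):
            line = line.strip()
            if not line or line.startswith(("CVE-", "CAN-", "-", "{")):
                continue   # entry headers and per-package lines (simplified)
            head = line.split(":", 1)[0].split()
            if not head or head[0] not in KNOWN_TAGS:
                print(f"{path}:{lineno}: unknown tag in {line!r}", file=sys.stderr)
            # A misspelling inside a free-form "NOTE: not-for-us" line is
            # invisible to this kind of check, which is exactly why a
            # dedicated NOT-FOR-US tag is easier to verify.

if __name__ == "__main__":
    for path in sys.argv[1:]:
        check(path)

Running something like that over the list before committing would catch
the misspelled variants I am worried about.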
> This information can be hard to get from CAN descriptions otherwise.
I think the iCAT database offers normalized product names and CVE
cross-references.
> Also to record what software name we checked for in Debian, in case
> it turns out we didn't look for the right thing or something like
> that. So I think it's worthwhile to continue including that
> information in not-for-us.
Agreed.
>> "INVALID" means that the bug report is known to be false. For
>> example:
>>
>> CVE-2003-0024
>> INVALID
>> NOTE: I have mailed Goran Weinholt <weinholt at debian.org> about this.
>> NOTE: Goran Weinholt <weinholt at debian.org> tells me that aterm 0.4.2 was
>> NOTE: never vulnerable to the problem described.
>> NOTE: this CVE is bogus.
>
> Not sure how this is better than just the NOTEs by themselves.
I used to treat bugs without an explicit resolution as to-do items. But
you are right: this is only necessary when bootstrapping annotations.
You currently handle this by automatically adding TODO entries, I think.
I could do something similar if I really want to go ahead and gain some
insight into sarge's security status.
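If I go that route, it would boil down to something like the following
sketch (just an illustration; I am guessing at the entry layout and at
the exact TODO text your script adds):

#!/usr/bin/env python3
# Sketch only: copy a list file to stdout, appending a "TODO: check" line
# to every entry that carries no annotation at all.  The entry layout and
# the exact TODO wording are guesses made for this illustration.
import sys

def add_todos(stream, out=sys.stdout):
    entry = []                      # header line plus its annotation lines
    def flush(entry):
        if not entry:
            return
        out.write(entry[0])
        if not any(l.strip() for l in entry[1:]):
            out.write("TODO: check\n")    # unannotated entry: mark as to-do
        out.writelines(entry[1:])
    for line in stream:
        if line.startswith(("CVE-", "CAN-")):
            flush(entry)
            entry = [line]
        elif entry:
            entry.append(line)      # annotation lines belong to the entry
        else:
            out.write(line)         # anything before the first entry
    flush(entry)

if __name__ == "__main__":
    add_todos(sys.stdin)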
> What's the value in having this be machine parseable?
We could generate a list of such issues. But it's more like QA for
CVE, and not really related to our task.