Hacker News

I don't know cryptocat or its authors, so I have no idea why you consider them immoral. What's the story?

Obligations are a two-way street, and good ethics should have support. If you have the means to reward disclosure of a vuln, you should announce a bug bounty.

Professions have ethical standards. Some are stronger than others. They are meant to impose a basic level of morality. In the real world, that never happens perfectly. But some of them definitely imply disclosing one's work without extracting every last penny from it, such as disclosing abandoned clinical trials.



I feel about Cryptocat the way you would probably feel about someone who set up an inner-city neurosurgery clinic after reading a bunch of Usenet HOWTO posts.

I think there are two separable arguments here. We may disagree on both of them. But:

* The first argument is whether it's OK for researchers to stockpile vulnerabilities --- to learn things about software and then not share them. This might seem like an artificial distinction, but there are lots of good researchers who back-pocket great, important vulnerabilities. They don't exploit them, they don't sell them, they just find them, make some notes, and move on.

* The second argument is whether it's OK for anyone to weaponize vulnerabilities. If you believe that the USG has an obligation to disclose vulnerabilities, you're almost (but not quite) required to believe they can't do exploit development work --- for any reason. Disclosing vulnerabilities to vendors kills exploits.

I'm OK with researchers stockpiling. I'm OK with the USG weaponizing. I'm OK with the latter in the same sense as I'm OK with them carrying firearms or breaking down doors to serve warrants or freezing bank accounts. Obviously, I'm not OK when the USG abuses those powers.


In the abstract, it seems OK for researchers to simply sit on vulns they have found, but is that what really happens? Why do that? Do they get sold eventually? Are there a lot of cases where the developer is hostile to fixing them? How OK this is depends on the eventual disposition.

The other one seems clearer: "Disclosing vulnerabilities to vendors kills exploits." Well, yes. The problem is that, in the present situation, endpoint security is terrible. It seems unlikely that our government has made it possible for themselves to break endpoint security, but not the Chinese or any other nation, organized crime group, or other non-state actor with some software smarts. It may take some catastrophic infrastructure penetration or super-Snowden leak to show why this is unwise.


Yes. Vendors are usually hostile to researchers, and vendors generally do feel entitled to researcher work-product. Their feeling is, it's their code, so they're entitled to know about problems with it.



