Encryption Government Security BSD

FBI Alleged To Have Backdoored OpenBSD's IPSEC Stack 536

Posted by kdawson
from the all-your-vpn dept.
Aggrajag and Mortimer.CA, among others, wrote to inform us that Theo de Raadt has made public an email sent to him by Gregory Perry, who worked on the OpenBSD crypto framework a decade ago. The claim is that the FBI paid contractors to insert backdoors into OpenBSD's IPSEC stack. Mr. Perry is coming forward now that his NDA with the FBI has expired. The code was originally added ten years ago, and over that time has changed quite a bit, "so it is unclear what the true impact of these allegations are" says Mr. de Raadt. He added: "Since we had the first IPSEC stack available for free, large parts of the code are now found in many other projects/products." (Freeswan and Openswan are not based on this code.)
  • But but but (Score:5, Insightful)

    by igreaterthanu (1942456) * on Tuesday December 14, 2010 @08:41PM (#34555102)

    Many eyes make FOSS software invulnerable to this sort of attack?

    Not trying to troll here, but seriously, people should be doing more audits, and doing them themselves.

    If this has been there for ten years, then this is ten years too late in spotting it.

  • by brunes69 (86786) <slashdot AT keirstead DOT org> on Tuesday December 14, 2010 @08:43PM (#34555138) Homepage

    Why engage in mass speculation? Check out the code from the time period in question and audit it for a back door. I don't know why everyone should get up in arms over an allegation that may very well be unfounded.

  • Re:But but but (Score:5, Insightful)

    by MichaelSmith (789609) on Tuesday December 14, 2010 @08:46PM (#34555174) Homepage Journal

    I doubt the situation would be any better if OpenBSD had been commercial and closed source. Who's to say the same back door isn't in Tru64, HP-UX and AIX?

  • Re:But but but (Score:3, Insightful)

    by igreaterthanu (1942456) * on Tuesday December 14, 2010 @08:53PM (#34555246)

    Commercial is different, though. With FOSS, I (and everyone else, for that matter) expect that there are no backdoors and that the code does exactly what it says it does.

    That is supposed to be one of the biggest "selling points" of FOSS.

  • Not likely (Score:4, Insightful)

    by Anonymous Coward on Tuesday December 14, 2010 @08:55PM (#34555260)

    It would be the NSA doing this, and they wouldn't require an NDA that would expire. Such an agreement would say that it could never be revealed. Sounds like a hoax.

  • Could be hard (Score:5, Insightful)

    by Sycraft-fu (314770) on Tuesday December 14, 2010 @08:55PM (#34555262)

    You have to remember that something like that wouldn't be in the code with a /*evil shit goes here*/ comment before it. To have survived, it would need to be well hidden. The idea that you can just look at code and find problems is false; if that were the case, no software would ever have any bugs.

    So to find it could take a lot of work, even when you know there is something to look for.

    This presumes, of course, that there IS something to look for and this isn't just some guy making shit up. I'm leaning more towards that option, since I don't see why the FBI wouldn't have a longer NDA. Classified material generally stays classified for 50 years, and something like that would surely be classified.

  • by Anonymous Coward on Tuesday December 14, 2010 @08:57PM (#34555286)

    If the backdoor was done well, it may be impossible to confirm. Not that this is how it was done, but many encryption routines define lots and lots of constants: random large primes and that sort of thing. You could assume that these constants were chosen for cryptographically sound reasons, and you might be right. You could also assume that these constants were derived from an external "secret key", and that anyone with this secret key would be able to decrypt data, and you might be right. Or maybe it's just designed to look like a programming error, e.g. if (uid = 0) { ... }. Plausible deniability is the name of the game; we may be able to fix the problem by rewriting the code from scratch, but we may never be able to say whether there was a problem in the original code to begin with.
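    A minimal C sketch of the assignment-as-comparison trick mentioned above (the function and its logic are invented for illustration, not taken from OpenBSD; a similar line was at the center of the 2003 attempt to slip a backdoor into the Linux kernel's sys_wait4):

    ```c
    #include <stdio.h>

    /* Hypothetical example: looks like a typo'd privilege check, but the
     * single '=' is an assignment. uid becomes 0, the condition is false,
     * and the function ends up granting root to every caller. */
    int check_access(int uid) {
        if (uid = 0)      /* should be: if (uid == 0) */
            return -1;    /* "deny root" -- dead code, never reached */
        return uid;       /* always 0: caller proceeds as root */
    }

    int main(void) {
        /* An ordinary user id comes back as 0 (root). */
        printf("effective uid: %d\n", check_access(1000));
        return 0;
    }
    ```

    Compilers now warn about this pattern (gcc's -Wparentheses, enabled by -Wall), which is one reason a modern plant would need to be subtler than this.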

  • Re:But but but (Score:5, Insightful)

    by Sycraft-fu (314770) on Tuesday December 14, 2010 @09:01PM (#34555338)

    Actually it would likely be harder. In the case of OSS, all you have to do is get people to contribute code. The FBI doesn't really have to be sneaky about it at all, beyond making sure the people don't reveal who they work for. They could even lie about who they are, since it is all done over the net anyhow. If it gets discovered, well, no big deal really: it is free and open, and nobody made the project accept those contributions. There are no legal problems that I can see.

    In the case of a company, you have to either subvert or plant employees there. Doing that without a court order would be illegal. It also has to go on undetected, of course, and that is much harder since the employee works physically at the company. Then there's the problem that if it becomes known, you may have a lawsuit on your hands, or congressional inquiry, and so on. Big companies wield a lot of power and would likely not be amused in the slightest.

    However what the GP is really saying overall is that if this turns out to be true (please note I am doubtful of that) it shows a weakness in the "many eyes" idea. That mantra is repeated over and over by OSS advocates almost like an incantation, that because something is open it means that all sorts of people are looking it over and there won't be anything evil in it. That is not the case, of course. Some OSS stuff is well audited, some is not. If this proves to be true it would show that even the pretty well audited stuff is not immune, that just having the source out in the open is not enough to guarantee security.

  • by rtfa-troll (1340807) on Tuesday December 14, 2010 @09:01PM (#34555342)
    So, this is going to be interesting. Imagine there were no back doors; how would you prove it? Want to discredit OpenBSD? That's how you would do it. Assume there are backdoors; now we have the first known clear example of illegally placed malware by a US Govt. group. The FBI is not the NSA, but they definitely have access to good people. Assume this was rogue players: warrantless wiretapping against US Govt. lawyers! In the absence of any pointer to relevant code, I would go with it being FUD, but I expect to be proved wrong.
  • Re:But but but (Score:5, Insightful)

    by gnapster (1401889) on Tuesday December 14, 2010 @09:12PM (#34555458)
    So what you are saying is, your OpenBSD box is running a version that is missing 60% of the timeline where edits could have been made to break this backdoor?
  • Re:But but but (Score:5, Insightful)

    by igreaterthanu (1942456) * on Tuesday December 14, 2010 @09:13PM (#34555460)
    Crackers don't like sharing their audit results for free.
  • by Opportunist (166417) on Tuesday December 14, 2010 @09:24PM (#34555562)

    Sure gonna. You left your fingerprint and all you are so dumb. You are really dumb. For real.

    (I can't believe how well this fits...)

  • Re:But but but (Score:5, Insightful)

    by Opportunist (166417) on Tuesday December 14, 2010 @09:32PM (#34555640)

    One of the biggest selling points of FOSS is that you can audit it at leisure, without having to go to the maker, give them a GOOD reason why you'd want to audit the source and sign NDAs with blood.

    Unaudited, FOSS is just as well audited as closed source. Duh.

    In other words, as long as everyone's too lazy/cheap/dumb to actually DO an audit, yes, FOSS is by no means more secure than closed source.

  • by Martin Blank (154261) on Tuesday December 14, 2010 @09:32PM (#34555642) Journal

    If it is true, it was submitted as source code, subject to review, accepted by the community, and installed by users. I see nothing illegal here.

    I also don't see where it's necessarily warrantless wiretapping. Sure, it could be used for that, but this kind of thing could also absolutely be used for warranted wiretapping. The FBI goes to a judge, gets a warrant, captures the traffic, and decrypts it using the backdoor. Again, nothing illegal.

    There are ethical issues with intentionally subverting such a project, but I don't see legal issues such as you claim.

  • Re:But but but (Score:2, Insightful)

    by Haeleth (414428) on Tuesday December 14, 2010 @09:40PM (#34555704) Journal

    That mantra is repeated over and over by OSS advocates almost like an incantation

    I constantly see people claim that OSS advocates use this argument, but I can't remember the last time I saw an OSS advocate actually using it.

    Really you are fighting something of a straw man. Nobody with a clue has ever claimed that "many eyes" is some kind of magical guarantee of security. It is not news that high-profile OSS code can contain very serious flaws; just think of the Debian OpenSSL incident!

  • Re:42 Grams. (Score:5, Insightful)

    by TheLink (130905) on Tuesday December 14, 2010 @10:41PM (#34556088) Journal

    The code obfuscation competitions aren't good examples, since obfuscated code looks hard to understand, which would make it more noticeable to auditors, or even to "normal programmers" looking at the code.

    It'll be stuff like "The Underhanded C Contest": http://underhanded.xcott.com/?page_id=17 [xcott.com]

    Or this: http://www.debian.org/security/2008/dsa-1576 [debian.org]
    Or "accidentally" leave in a few exploitable buffer overflows or other "normal" bugs.

    As for over-reliance on "many eyes": just relying on it at all is over-reliance. The "many eyes" claim does not hold when it comes to _security_ bugs.

    There are many eyes, but they're all "watching TV". They'll notice if a bug crashes their DVR or causes image corruption; beyond that, no.

    There are only very few skilled experienced eyes auditing the code, and not all of those are on the "defending" side.
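    A hypothetical sketch, in the spirit of the Underhanded C Contest linked above, of the kind of "normal-looking" exploitable bug those few auditing eyes have to catch (the struct, field names, and layout are invented for illustration; this is not OpenBSD code):

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical example of an "accidental" off-by-one; not taken from
     * any real project. */
    struct session {
        char name[16];
        char restricted;  /* nonzero = sandboxed; directly follows name */
    };

    void set_name(struct session *s, const char *input) {
        size_t len = strlen(input);
        if (len > sizeof(s->name))   /* reads like a correct bounds check... */
            len = sizeof(s->name);
        memcpy(s->name, input, len);
        s->name[len] = '\0';         /* ...but a 16-char input makes this write
                                        s->name[16], one byte past the array,
                                        which on typical layouts is the
                                        `restricted` flag. */
    }

    int main(void) {
        struct session s = { "", 1 };
        set_name(&s, "alice");             /* short input: behaves correctly */
        printf("restricted = %d\n", s.restricted);
        set_name(&s, "sixteen_chars_ab");  /* 16 chars: on common compilers and
                                              layouts the stray '\0' clears the
                                              flag */
        printf("restricted = %d\n", s.restricted);
        return 0;
    }
    ```

    The guard should allow at most sizeof(s->name) - 1 bytes to leave room for the terminator; an auditor has to spot that off-by-one, not a /* backdoor */ comment.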

  • Re:Oh shit... (Score:5, Insightful)

    by JeffSh (71237) <jeffslashdot AT m0m0 DOT org> on Tuesday December 14, 2010 @10:56PM (#34556198)

    While funny, it misses the bigger picture: the OpenBSD stack/code is embedded in many other devices, especially VPN/firewall appliances.

  • Re:But but but (Score:5, Insightful)

    by thePowerOfGrayskull (905905) <marc.paradise@NOSpaM.gmail.com> on Tuesday December 14, 2010 @11:07PM (#34556292) Homepage Journal
    Of course... your comment serves to underscore the importance of open source. While GP noted that it *should* have been caught in OpenBSD, at least the potential for it to have been caught was there. If it's in Linux as well, we'll know very soon, since it's reasonably certain that people are looking now. If it's in MS products... well, that's something we'll never know.
  • by NiceGeek (126629) on Tuesday December 14, 2010 @11:16PM (#34556370)

    In fact, if someone like Assange had pulled this crap back then, he'd have found himself with a fatal necktie.

  • by skids (119237) on Tuesday December 14, 2010 @11:36PM (#34556536) Homepage

    99.99% of code can be cleaned up by talented enough audit freaks. Crypto code is in the other 0.01%; proper cryptography development requires doctorate-level mathematics skills.

  • Re:But but but (Score:4, Insightful)

    by sjames (1099) on Wednesday December 15, 2010 @01:05AM (#34557052) Homepage

    Use the source! There's no need to wonder, pick a likely function, audit it, and post your results!
