

How Does Heartbleed Alter the 'Open Source Is Safer' Discussion?

Soulskill posted about 9 months ago | from the or-at-least-marginally-less-unsafe dept.

Open Source 582

jammag writes: "Heartbleed has dealt a blow to the image of free and open source software. In the self-mythology of FOSS, bugs like Heartbleed aren't supposed to happen when the source code is freely available and being worked with daily. As Eric Raymond famously said, 'given enough eyeballs, all bugs are shallow.' Many users of proprietary software, tired of FOSS's continual claims of superior security, welcome the idea that Heartbleed has punctured FOSS's pretensions. But is that what has happened?"


Open source was never safer (1, Flamebait)

mozumder (178398) | about 9 months ago | (#46761087)

Closed source was always safer.

Not sure where you got the idea that open source was safer?

Re:Open source was never safer (-1, Troll)

Anonymous Coward | about 9 months ago | (#46761115)

shhhhh dont upset the loonucks.

Re:Open source was never safer (5, Insightful)

Anonymous Coward | about 9 months ago | (#46761227)

I don't know; Microsoft got caught with a password check you could waltz through by entering all spaces, which is slightly worse than forgetting to put a length check back onto something. Admittedly the stakes are not the same, but you can check the code, and enough people do that it works.
It's safer in terms of checking for back doors; sloppy coding anyone can do.

Re:Open source was never safer (3, Interesting)

Jeremiah Cornelius (137) | about 9 months ago | (#46761423)

Closed source is not inherently safer. Raymond's proposition is theoretically sound, however in actual practice, the NSA has "many eyes"...

Re:Open source was never safer (4, Insightful)

Anonymous Coward | about 9 months ago | (#46761735)

PlUU-lease! Where are my "overrated" mods when I need them?

The NSA is why my hair has fallen out and my gut has gotten big. They're also behind the big mudslide in Washington. In fact, they are the boogeyman for EVERYTHING!

God you people get annoying.

Re:Open source was never safer (1)

unixisc (2429386) | about 9 months ago | (#46761393)


Re:Open source was never safer (5, Insightful)

LordThyGod (1465887) | about 9 months ago | (#46761401)

Closed source was always safer.

One word for you: Microsoft. Maybe two: Adobe.

Re:Open source was never safer (5, Interesting)

erroneus (253617) | about 9 months ago | (#46761601)

Closed source is hazardous in many ways. Along with being more frequently targeted, the NSA revelations showed that Microsoft worked with the NSA when deciding how quickly to close some holes. Another hazard is the threat of being attacked and/or sued by companies whose products were found to have problems.

No question the Heartbleed thing is a huge and embarrassing problem. But you know? It's actually kind of hard to count the number of high-profile vulnerabilities in F/OSS software, as not a whole lot come to mind. On the other hand, the list is enormous for closed source from large companies... also hard to count, but for another reason.

It does highlight one important thing about F/OSS, though: code auditing and other security practices are pretty important even when a project has enjoyed long, stable, and wide deployment. Just because it's a very mature project doesn't mean something hasn't been sitting there unnoticed for a long, long time. People need wake-up calls from time to time, and F/OSS developers can be among the worst when it comes to their attitudes about their territories and kingdoms. (I can't ever pass up the opportunity to complain about GIMP and GNOME... jackasses, the lot of them.)

Re:Open source was never safer (4, Insightful)

MightyMartian (840721) | about 9 months ago | (#46761447)

Only if one buys that "security through obscurity" is a legitimate form of network safety. A decade's worth of Internet Explorer and ActiveX vulnerabilities would suggest you're wrong.

Leaked by codenomicon (4, Interesting)

symbolset (646467) | about 9 months ago | (#46761113)

Which is run by a former Microsoft executive who was in charge of security. I guess he can gloat about being personally responsible.

Wat? (5, Insightful)

Anonymous Coward | about 9 months ago | (#46761129)

In the self-mythology of FOSS, bugs like Heartbleed aren't supposed to happen when the source code is freely available and being worked with daily.

False. Bugs can and do happen. However, what can also happen with open source software is that entities other than the group working on the project can find bugs. In this case, Google found the bug. If the source were not open, maybe it would have never been officially recognized and fixed.

Re:Wat? (-1)

Anonymous Coward | about 9 months ago | (#46761317)

If the source were not open, maybe it would have never been officially recognized and exploited.


Re:Wat? (1)

Anonymous Coward | about 9 months ago | (#46761385)

And yet, closed source bugs are found and exploited just as effectively, but rarely fixed as a result.

Re:Wat? (0)

Anonymous Coward | about 9 months ago | (#46761319)

On the other hand, Google is a prime example of a customer that could well be doing its own audits and examination of the software regardless; they aren't operating out of a garage, and probably have the source to any number of proprietary systems.

Re:Wat? (5, Interesting)

tysonedwards (969693) | about 9 months ago | (#46761367)

It is a double edged sword. Because one can see the code, there is visibility into the process. Because OpenSSL is such a common tool and is arguably vital to the function of the Internet as we know it, this sort of a bug really is one of those "worst case scenarios" PR wise, as opposed to being cleanly swept under the rug as is possible in the case of many Closed Source 0-day vulnerabilities.

The problem here is that people have been using the argument that Open Source is better because these issues can't happen "because" of the visibility. And the argument "Open Source is inherently safer" has been very heavily damaged by Heartbleed and now ranks up there with "Macs don't get viruses" and "Women are worse drivers".

If this happened in Microsoft, Adobe or Oracle Land, this would be "yet another 0-day" and largely ignored by the public. Because it is in an area with such a vocal group of people spouting "impenetrable" for decades, it all of a sudden becomes quite newsworthy in a way that a "yet-another-remote-code-execution-with-privilege-escalation-in-Acrobat-Reader" vulnerability doesn't.

And if you doubt any of this for a moment, have you ever heard the name of the developer who was at fault for introducing a bug into Flash on the local news? Now did you hear the name "Robin Seggelmann" in connection to Heartbleed?

Re:Wat? (0)

aliquis (678370) | about 9 months ago | (#46761633)

If this happened in a closed source product, I guess the questions of whether it was intentional and whether any closed source product could be trusted would have been way steamier here on /.

Re:Wat? (2, Insightful)

clarkkent09 (1104833) | about 9 months ago | (#46761391)

True, but it is also easier for malicious people to find vulnerabilities when they have the source code. There are other disadvantages, a broad developer base allows vulnerabilities to be deliberately introduced more easily and it's harder to enforce standards etc.

I searched and couldn't find a good study or any reliable evidence either way. There is good and bad open source software and there is also good and bad commercial software. Posting with absolute certainty that open source is more secure will get you modded up around here but I would like to see some evidence.

Re:Wat? (4, Insightful)

F.Ultra (1673484) | about 9 months ago | (#46761495)

You seriously think that black hats bother with reading millions of lines of code in the hope of finding an exploit, when all they have to do is play with the data sent to services/applications and see if they misbehave? Which is why exploits are found equally in closed and open software.

Re:Wat? (2)

clarkkent09 (1104833) | about 9 months ago | (#46761569)

Well, you wouldn't start by reading millions of lines of code but it certainly helps to have access to it. Especially for people with serious resources, governments etc.

Re:Wat? (1)

Anonymous Coward | about 9 months ago | (#46761563)

Google wasn't the only one to find the bug; there was another independent discovery within 24 hours. What this means is it's functioning EXACTLY as intended. Large open source projects of high importance are getting many independent sets of eyes on the problem, and real, serious, nightmare-inducing exploits are getting found and FIXED rather than exploited for extended periods of time.

Re:Wat? (1)

aliquis (678370) | about 9 months ago | (#46761611)

Also, of course, regardless of whether the product is open source or proprietary and paid for, you can't draw any conclusions from that about the skills of the individuals who wrote the code. But if it's a high-prestige brand/project, I guess chances are higher they have been picky than if it's some small, rather unknown one-person thing.

The idea was to make the point that you may not want to trust, for instance, the individuals who roll their own packages for your Linux distribution of choice and offer them for download from some random page, or to trust THISISTHEBEST___INTHEWORLDBUTITSNOTAWELLKNOWNPRODUCT from someone rather unknown.

But I guess it all fails with this being OpenSSL, which I feel is a high-prestige, well-known product where safety should be important, and still it simply failed.

Somewhat related: I noticed that Fedora runs OpenSSH by default, with the defaults (PermitRootLogin yes), listening to the whole world, which imho is completely backwards; I don't see why one would want that as the default. I guess it could be argued that "hey, someone may need that to access the computer after installation!", but in that case let them set it up in the installer, or make a special installation with such settings. Really, are they using the regular installer with no keyboard and screen hooked up, so they couldn't turn it on themselves afterwards if they wanted to?

It did seem like none of the BSDs ran sshd by default, which imho is much more reasonable. Whether to allow root by default one could argue about; since the OpenSSH default is PermitRootLogin yes, I guess it makes some sense to keep that rather than changing it, but I guess there has been some argument about that one too. A way of rescuing a poorly set-up installation? Possibly better (imho) to just force people to redo it correctly if they mess up and really need some way to get in.

And regarding trusted source code, prestige projects, and whether anyone is actually watching the code and finding the bugs: what happened with the claim about a backdoor in, was it OpenBSD or OpenSSH? Was it just bullshit, or something real? I guess the first question would be whether anything was actually found, because without that the answer would of course be "we don't know" =P

Guess I'm off-topic enough to not take it even further so I'll stop there :)
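For what it's worth, the hardening the parent wishes were the default is a few lines in /etc/ssh/sshd_config (a sketch only; option names are from sshd_config(5), and the address shown is a made-up example):

```
# Disallow root password logins ("without-password" would still allow key-only root)
PermitRootLogin no
# Keys only, no password authentication at all
PasswordAuthentication no
# Don't listen to the whole world; bind to one interface (example address)
ListenAddress 192.0.2.10
```

Whether a distribution should ship with sshd enabled at all is, as the comment notes, a separate argument from what its config defaults should be.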

Re:Wat? (3, Insightful)

alex4u2nv (869827) | about 9 months ago | (#46761619)

Correct -- I could imagine that there are lots of "heartbleeds" in closed source software that can and will be exploited. Whether it becomes public and puts pressure on the development staff to fix, is another story.


Mr Fixit (5, Insightful)

frisket (149522) | about 9 months ago | (#46761137)

All that has happened is that FLOSS has been shown to react faster to security revelations than closed or proprietary software.

That's fine with me.

Re:Mr Fixit (4, Insightful)

iluvcapra (782887) | about 9 months ago | (#46761273)

That it reacts fast is good. That the bug could be audited in the source, in public, is good.

We should remember that FLOSS reacted very quickly to the "revelation," but the bug itself has been sitting there for years, which isn't really supposed to happen.

It's nice we know how long it's been there, and can have all kinds of philosophical discussions about why the OpenSSL folks decided to write their own malloc.

Also OpenSSL was effectively a monoculture and just about every SSL-encrypted internet communication over the last two years has been compromised. OpenSSL has no competition at its core competency, so the team really has no motivation to deliver an iteratively better product, apart from their need to scratch an itch. FLOSS software projects tend not to operate in a competitive environment, where multiple OSS products are useful for the same thing and vie for placement. This is probably bad.
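As an aside, the "write their own malloc" point refers to OpenSSL's freelist allocator. A minimal sketch (illustrative C only, not OpenSSL's actual code) of why such a wrapper can hide bugs from Valgrind or a hardened system malloc:

```c
/* Freelist-style allocator sketch: freed buffers go onto a list and are
 * handed straight back out, so stale data survives reuse and the system
 * allocator (and tools watching it) never sees the "free" at all. */
#include <stdlib.h>

#define BUF_SIZE 64

static void *freelist = NULL;

void *buf_alloc(void) {
    if (freelist) {                 /* reuse a freed buffer, contents intact */
        void *p = freelist;
        freelist = *(void **)p;     /* pop next-pointer stored in the buffer */
        return p;
    }
    return malloc(BUF_SIZE);
}

void buf_free(void *p) {            /* never returns memory to the system */
    *(void **)p = freelist;         /* stash next-pointer in the buffer itself */
    freelist = p;
}
```

A use-after-free against `buf_free`'d memory still "works" here, which is exactly why a bug can sit unnoticed: the crash a normal allocator might produce never happens.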

Re:Mr Fixit (1)

Anonymous Coward | about 9 months ago | (#46761409)

No, you've got the timeline wrong on the bug. The bug was committed to the repository more than two years back, but it wasn't in major use until more recently. RHEL, for example, didn't ship the buggy code until November 2013. And, by extension, Centos, Oracle Enterprise Linux, Amazon Linux AMI, etc.

Re:Mr Fixit (2)

MightyMartian (840721) | about 9 months ago | (#46761575)

Debian was a bit longer, so far as mainline releases go (I don't use testing branches). I have several servers and routers running 6.0, and they're all using OpenSSL 0.9.8, whereas the servers I use as KVM virtualization hosts are running Wheezy and did have vulnerable versions of OpenSSL. I had been thinking over the last few months that I should upgrade my old Debian Squeeze servers and appliances, a number of which are used as OpenVPN WAN routers and remote-client servers. I'm very glad my business/procrastination prevented me from upgrading those systems; they remained untouched, and I don't have to go through the pain of regenerating keys and rolling them out to remote routers and to all the road warriors and work-at-home types.

Re:Mr Fixit (5, Insightful)

CajunArson (465943) | about 9 months ago | (#46761481)

" just about every SSL-encrypted internet communication over the last two years has been compromised."

No, it really hasn't.

It's accurate to say that just about every OpenSSL-encrypted session on servers running NEW versions of OpenSSL (not all those out there still stuck on 0.9.8-whatever, which never had the bug) was potentially vulnerable to attack.

That's bad, but it's a universe away from "every SSL session is compromised!!!", because that's not really true.

Re:Mr Fixit (3, Interesting)

Desler (1608317) | about 9 months ago | (#46761523)

Which is a ridiculous statement to make in this situation. That's like patting your security company on the back for not noticing for two years that someone was secretly stealing money out of your bank vault, and only doing something after a third party told them there was a problem. But hey, they reacted fast two years after the fact, right?

we don't know what happened AT ALL (3, Insightful)

globaljustin (574257) | about 9 months ago | (#46761139)

Yes, we can trace the changelogs in the software & note who was checking the changes and missed them, but that all can be circumvented.

The fact is we don't know if Heartbleed was an honest mistake or not...we don't know who knew, and when...we don't know a lot.

FOSS is nowhere in the conversation, btw...this has absolutely nothing to do with the fact that this was an open source project.

Private companies' products have ridiculous security issues...comparing this to that is not helpful.

Re:we don't know what happened AT ALL (0)

Anonymous Coward | about 9 months ago | (#46761439)

Well, the dude to which the change is attributed has acknowledged his mistake and his reviewer has too, for what it's worth.

Re:we don't know what happened AT ALL (5, Informative)

Cid Highwind (9258) | about 9 months ago | (#46761517)

"Yes, we can trace the changelogs in the software & note who was checking the changes and missed them, but that all can be circumvented."

Actually it can't. That's kind of the point of git.

"The fact is we don't know if Heartbleed was an honest mistake or not...we don't know who knew and when..."

We do know who and what and when, because the person who wrote it and the person who signed off on it have commented publicly about the bug.

Maybe you're thinking of Apple's "goto fail" SSL exploit where we really don't know who or what or when and probably never will because it's not likely Apple is going to release their RCS logs.

Re:we don't know what happened AT ALL (1)

ffkom (3519199) | about 9 months ago | (#46761621)

Plus you can have a look at what the person who contributed the code and the reviewer programmed/did elsewhere in their lives, and by that you can judge whether you think it's likely they acted on purpose. In this case it seems to me the probability of this bug having been introduced intentionally is pretty low.

Even a bestselling novel can have a typo (5, Insightful)

sandytaru (1158959) | about 9 months ago | (#46761141)

We're surrounded by tiny errors in the world. Heck, they're even built into our DNA. The vast majority of tiny little errors do no harm, and we don't notice them. We gloss over them, like a typo in a book. It's just that every once in a while, a tiny little error can occur that snowballs into something much greater. Like cancer. Or a massive, accidental security leak.

More eyeballs usually do make bugs more shallow, but only if the eyes know what to look for.

Re:Even a bestselling novel can have a typo (2)

unimacs (597299) | about 9 months ago | (#46761305)

More eyeballs usually do make bugs more shallow, but only if the eyes know what to look for.

And only if a significant number of sophisticated and knowledgeable eyes have the time and interest to dig through lines and lines of code looking for vulnerabilities.

The reality is that the majority of eyeballs looking at code are the ones that have other reasons to be looking at it. They aren't necessarily looking for vulnerabilities but maybe they spot something.

The eyes that might be interested in scouring code looking for vulnerabilities could be the ones wanting to exploit them rather than fix them.

Re:Even a bestselling novel can have a typo (2)

unixisc (2429386) | about 9 months ago | (#46761485)

The 'millions of eyeballs' meme is just that. How many people actually know how to read code? Just because it's open doesn't mean it's comprehensible, so the fact that the code is open and out there doesn't confer that much of an advantage, particularly when it's such complex code.

Wait until things are over before you cry wolf (4, Insightful)

slincolne (1111555) | about 9 months ago | (#46761157)

It's probably better to let the situation run on a bit longer before people start criticising Open Source.

Nobody is going to discard OpenSSL due to this - the majority of people are patching systems and reminding people that security is important (a side benefit of this incident)

The next step will be when someone puts up the money for a proper code review of the OpenSSL codebase and fixes up any other issues that may exist.

It's reasonable to say that there are more people and organisations able to resolve this issue than if it were a closed source proprietary solution.

This is more like... (0)

Anonymous Coward | about 9 months ago | (#46761303)

The series of process errors that resulted in Chernobyl, Three Mile Island, and Fukushima Daiichi.

A small series of 'innocuous' oversights leading to the sort of far-reaching disaster that could end the lives of a non-trivial number of people.

And don't think that it couldn't, since any number of countries could have been using this to catch insurgents or free thinkers for up to the last 2 years!

Security is hard. Encryption is even harder. (1)

kriston (7886) | about 9 months ago | (#46761159)

All this episode does is to remind us that security is hard. Encryption is even harder.

Original premise is false (5, Insightful)

bazmail (764941) | about 9 months ago | (#46761161)

Many eyeballs may make bugs shallower, but those many eyeballs don't really exist. Source availability does not translate into many people examining that source. People, myself included, may like to build and install packages, but that's it.

What we need are intelligent bots to constantly trawl source repositories looking for bugs. People just don't have the time any more.

Re:Original premise is false (2, Insightful)

jklovanc (1603149) | about 9 months ago | (#46761325)

What we need are intelligent bots to constantly trawl source repositories looking for bugs.

If we had bots that intelligent they would be intelligent enough to write the code without bugs.

Re:Original premise is false (1)

suutar (1860506) | about 9 months ago | (#46761653)

I dunno. Coverity can catch a lot of stuff (in fact, I recall reading that they had to limit what they flagged to what they could explain to the programmer, because confusing the programmer led to incorrect 'false positive' dismissals). I don't know if it would have caught this, but it would be worth trying.

Re:Original premise is false (1)

unixisc (2429386) | about 9 months ago | (#46761505)

Many eyeballs may make bugs shallower, but those many eyeballs don't really exist. Source availability does not translate into many people examining that source. People, myself included, may like to build and install packages, but that's it. What we need are intelligent bots to constantly trawl source repositories looking for bugs. People just don't have the time any more.

Not just that, the only people who'd find such bugs are the people actually working on those programs. Usually, not their downstream users.

Re:Original premise is false (2)

F.Ultra (1673484) | about 9 months ago | (#46761547)

Well, someone must have been looking, since the bug was found?

Re:Original premise is false (0)

Anonymous Coward | about 9 months ago | (#46761635)

The bug was found because someone figured out the exploit, not because someone looked at the code and noticed a problem. I believe a major problem for FOSS is the lack of organized QA; code reviews are one thing, but proper regression testing is too often left to the end user.

Re:Original premise is false (0)

Anonymous Coward | about 9 months ago | (#46761565)

They certainly exist now given the recent revelations. The code is now being examined with a fine tooth comb. The problem in the past was probably no-one was doing so because they were assuming others already were/had done so.

Re:Original premise is false (1)

quantaman (517394) | about 9 months ago | (#46761687)

I don't think Heartbleed says anything fundamental about open source security, but it might alter the discussion of how certain low level packages are managed. By any measure OpenSSL is a very important package, but it's also a bit generic. It has a very defined role that everyone needs, but I'm not sure how many people really have a motive to work on it in specific. It might be that the community needs to find a way to devote more resources to maintaining and auditing those packages.

Overstating the case (5, Insightful)

kurisuto (165784) | about 9 months ago | (#46761165)

I don't think anyone claims that open-source software won't ever have security issues. The claim is that the open-source model tends to find and correct the flaws more effectively than the closed-source model, and that the soundness of the resulting product tends to be better on average.

One case does not disprove that. The key words there are "tends" and "on average".

Re:Overstating the case (3, Insightful)

Zocalo (252965) | about 9 months ago | (#46761381)

This, and I suspect a lot of shilling by proprietary software vendors playing down the "many eyes make bugs shallow" thing. This wasn't so much a failure of the open source model as a failure to properly vet commits before accepting them into the main tree, and that could happen just as easily under a closed source development model as an open source one. That might be OK for small hobby projects, and perhaps even for major projects whose flaws don't have such major ramifications, but hopefully this will serve as a wake-up call for projects that aim to form some kind of critical software infrastructure. For such projects, requiring that commits be reviewed and "signed off" by one or more other developers would perhaps have caught this bug, and others like it, and could work very well in conjunction with some of the bug-bounty programmes out there. Of course, "find a flaw in our pending commits, and get paid!" only works if the code is open for inspection...

FOSS is still safer... (2)

jonwil (467024) | about 9 months ago | (#46761179)

How do we know that serious security flaws don't exist in the SSL implementations used by Microsoft or other proprietary vendors?

It doesn't. (4, Insightful)

BronsCon (927697) | about 9 months ago | (#46761181)

It's 6 of one, half-dozen of the other.

Anyone can view the source of an open source project, which means anyone can find vulnerabilities in it: hackers wishing to exploit the software, as well as users wishing to audit and fix it. But someone who knows what they're doing has to actually look at the source for that to matter, and this rarely happens.

Hackers must black-box closed source software to find exploits, which makes it more difficult than finding them in open source software; the flip side is that they can only be fixed by the few people who have the source. If the hacker doesn't disclose the exploit and the people with access to the code don't look for it, it goes unpatched forever.

Open source software does provide an advantage to both sides, hackers can find exploits more easily and users can fix them more easily; with closed source, you're at the mercy of the vendor to fix their code but, at the same time, it's more difficult for a hacker to find a vulnerability without access to the source.

Then, we consider how good fuzzing techniques have gotten and... well, as it becomes easier to find vulnerabilities in closed source software, open source starts to look better.

Re:It doesn't. (1)

danheskett (178529) | about 9 months ago | (#46761313)

And we know this happens - researchers learn about zero-day exploits in the field every day. What are the odds that we learn about all of them? Zero, I'd wager.

People who do really deep audits of a system after a breach know what this is like. When you get that feeling that you are up against something new, or something unreported.

Re:It doesn't. (1)

BronsCon (927697) | about 9 months ago | (#46761347)

And anyone who's serious about security is taking mitigation steps for every scenario they can conceive, known exploit or not. That should be SOP whether or not you have source available.

Re:It doesn't. (2)

Tontoman (737489) | about 9 months ago | (#46761473)

It is also trivial to disassemble and decompile closed-source software. Starting with the names of routines at the public entry points, you can trace arguments through the code and thus find potentially exploitable defects almost as easily as by reading the rather obscure "style" of the OpenSSL code that had the Heartbleed bug. The problem is, there is a chilling effect from the laws and uncertainty surrounding reverse engineering: http://www.chillingeffects.org... [chillingeffects.org] . Therefore, perhaps only criminals will do it, looking for exploits, rather than well-funded (fat enforcement target) Google development teams. Therefore closed source is more vulnerable.

Re:It doesn't. (0)

Anonymous Coward | about 9 months ago | (#46761693)

It's 6 of one, half-dozen of the other.

Imperial, metric, or baker's dozen?

Re:It doesn't. (3, Interesting)

ratboy666 (104074) | about 9 months ago | (#46761723)

This myth gets trotted out again. It is arguably easier to find exploits without source. The source distracts from the discovery of an exploit. The binary simply is. The black-hat is looking for a way to subvert a system. Typically she is not interested in the documented (by source or documentation) functionality. That simply distracts from the issue which is finding out what the software actually does, especially in edge circumstances.

This is what fuzzers do. Typically not aware of the utility of the program, they simply inject tons of junk until something breaks.

Source availability tends to benefit people auditing and repairing more than black-hats.

Yes, it took years for Heartbleed to surface. If Heartbleed (or a defect like it) was discovered by a code audit, that speaks to the superiority of open source over closed source. If such a defect is found by fuzzing or binary analysis, it is much harder to get repaired under closed source, as users are then at the mercy of the holder of the source. Build a matrix of Open/Closed Source vs. bug found in source vs. bug found by fuzzing/binary analysis.

Bug found in source vs. Closed Source is not applicable, giving three elements: found in source vs. Open Source (where the bug can be repaired in the source by anyone), and bug found by fuzzing, where the bug will be repaired by anyone (Open Source) or by the vendor (Closed Source).

The question then is (as I started the article): is it easier to find bugs by source inspection? Assume big threats will HAVE the source anyway. If a bug were easy to find by inspection, it would be easy to fix (for example: OpenBSD continuously audits, and security has been a priority at Microsoft for the past decade). Fuzzing and binary analysis are still the preferred (quickest) methods, giving the edge to Open Source. The reason is simple -- the black-hat cares about what is actually happening, not about what the source says is happening.
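The fuzzing approach described in this thread can be sketched in a few lines of C (illustrative only; `parse_record` is a made-up stand-in for any network-facing decoder):

```c
/* Fuzzing sketch: throw random buffers at a parser and watch for an
 * invariant violation, with no knowledge of what the bytes "mean". */
#include <stdlib.h>

/* Toy target: a record whose first two bytes declare its payload length.
 * Rejects records whose declared length exceeds the data actually present.
 * (Delete that check to watch the fuzzer "win".) */
int parse_record(const unsigned char *buf, size_t len) {
    if (len < 2)
        return -1;
    size_t declared = ((size_t)buf[0] << 8) | buf[1];
    if (declared > len - 2)
        return -1;              /* the Heartbleed-style check */
    return (int)declared;
}

/* Fuzz loop: random lengths and bytes, checking the parser never accepts
 * a declared length that would read out of bounds. Returns 1 if no
 * violation was observed, 0 on the first violation. */
int fuzz(unsigned iterations, unsigned seed) {
    unsigned char buf[64];
    srand(seed);
    for (unsigned i = 0; i < iterations; i++) {
        size_t len = (size_t)(rand() % (int)sizeof(buf));
        for (size_t j = 0; j < len; j++)
            buf[j] = (unsigned char)(rand() & 0xff);
        int declared = parse_record(buf, len);
        if (declared >= 0 && (size_t)declared > len - 2)
            return 0;           /* invariant violated: over-read accepted */
    }
    return 1;
}
```

Note the loop never reads the source of `parse_record`; as the parent says, the fuzzer only cares what the code actually does at the edges.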

Not enough eyes (4, Insightful)

Phillip2 (203612) | about 9 months ago | (#46761183)

So, the "with many eyes all bugs are shallow" notion fails. There were not enough eyes on the OpenSSL library, which is why nobody discovered the bug.

Except that someone did discover the bug, when they were looking at the code because it was open source. And they did report it. And it did get fixed. Later than anyone would want, of course. But it happened. Maybe similar errors have been and are being missed in the Windows and Mac implementations.

Re:Not enough eyes (1)

Marginal Coward (3557951) | about 9 months ago | (#46761437)

Good point. But maybe they would have been missed by the bad guys, too. Maybe open source makes it easier for the good guys to find bugs, but it also makes it easier for the bad guys to find bugs the good guys haven't found yet. I don't know if there are bad guys scouring open source code for things like this, trying to find them first, but bad guys who enjoy finding exploits like this (or get paid to) might prefer reading source to disassembly.

My point isn't that open source is or isn't better than closed in this regard, but that the important factor is who finds an exploit first - whether it's in open source or not. And that probably depends mostly on who's focusing their attention on it.

That said, if I were a bad guy, the rest of the OpenSSL library's open source would seem like a pretty juicy read right now. Then again, it probably sounds like fun to the good guys too.

And.. (0)

Anonymous Coward | about 9 months ago | (#46761701)

it had been run through automated analysis software and passed, because a different part of the code was doing something naughty for which there hadn't been a test case previously (mallocing, freeing, then re-mallocing a section of memory and assuming the pointer address would remain the same).

A bigger issue might be the 'tunnel vision' approach of analysis. This bug was the result of *MULTIPLE* little issues combining into a big one. If anybody had actually stepped through that entire code path, the problem would've been found earlier; most likely the function itself was scrutinized, passed 'first glance' muster, then was forgotten about as a non-critical code path.

As this issue has shown, however, the non-critical code paths are ripe for exploitation by malicious actors, be they private or state.

Looking forward (1)

petes_PoV (912422) | about 9 months ago | (#46761185)

The issue is not that some open source software has a bug in it. We're all grown-up enough (I hope) to realise that NO software is ever perfect.

The only interesting point about this situation is how the Open Source world reacts to it and what processes get put in place to reduce the risk of a similar situation arising in the future.

It doesn't. (0)

Anonymous Coward | about 9 months ago | (#46761197)

The point of open source is that it allows independent code inspection, not that it promises security. Microsoft has had many vulnerabilities discovered and exploited without ever releasing source code. The vulnerability in question might not even have been discovered by an inspection of the code. All it would take is a typo to have your code hand back four letters of "bob" instead of three.

Given enough $, all people are shallow... (1)

Anonymous Coward | about 9 months ago | (#46761199)

Bugs happen; leaving the source open just gives individuals an opportunity to find them. It doesn't imply that all bugs will be found instantaneously, just that anyone can look for them. Compare this to closed source, which has a very narrow group of people examining the code base, and only their word that everything is sound. I hate to think how long a flaw like this would go unchecked and exploited if only the gatekeepers were allowed to check out the goods.

Uh, what? (5, Insightful)

Zontar The Mindless (9002) | about 9 months ago | (#46761209)

Q: How Does Heartbleed Alter the 'Open Source Is Safer' Discussion?

A: It doesn't. OSS is purported to be a *better* software development methodology. "Better" != "perfect". TFS is a troll.

This bug was found in OpenSSL because it was open (5, Insightful)

Anonymous Coward | about 9 months ago | (#46761225)

What hasn't been found in closed source software because it is too inconvenient to look?

Re:This bug was found in OpenSSL because it was op (0)

Anonymous Coward | about 9 months ago | (#46761405)

All the convenient "leaks" that are placed there to be helpful to the NSA?

You know -- the "all-American" Microsoft that bows to its master like a faithful servant?

security through obscurity (0)

Anonymous Coward | about 9 months ago | (#46761229)

/sarcasm/ And proprietary software's security through obscurity is so much better a model. /end sarcasm/

At least it's known that this has been fixed, and it will hopefully keep developers from getting too lax in the future. Both of which are unknowns in proprietary software.

NSA (1)

Anonymous Coward | about 9 months ago | (#46761239)

The huge problem with OSS is that if no one takes the responsibility to do a good code audit for a project, the NSA will do that independently, file the found exploits, and tell nobody.

w/o disclosure, exploiting closed source tempting (1)

anapsix (1809378) | about 9 months ago | (#46761249)

A similar issue in closed source would have less chance of discovery, and if/when discovered would not be disclosed in the same way, but would most likely be kept on the down-low... while being exploited by interested parties.

Code written by Humans is always flawed. (1)

jlbprof (760036) | about 9 months ago | (#46761255)

At least with FOSS, you can quickly identify and fix problems as they show up. Proprietary fixes happen only when the vendor has no other choice.

Re:Code written by Humans is always flawed. (0)

Anonymous Coward | about 9 months ago | (#46761425)

Especially when said code is written in C.

Security vs insecurity (0)

Anonymous Coward | about 9 months ago | (#46761257)

Proponents of open/closed source both make valid points about security, however both leave you with a FALSE sense of security.

Both these statements are false at some level:
Open source: because so many people look at it, it has to be secure. (But obviously, things can be missed.)
Closed source: written by professionals (hopefully), and even if there's a flaw, no one has the code to find it, so it has to be secure.

The biggest difference I see is that bugs like this in open source blow up bigger and get a lot more media attention. Bugs just as bad hit closed source all the time, but the company makes an active effort (again, hopefully) to keep them quiet and patch them. If closed source didn't have this problem, there would be no patches -- and the same is true of open source. Obviously, I have a bias towards the open source model, but these are my random thoughts.

What if... (4, Insightful)

chiefcrash (1315009) | about 9 months ago | (#46761277)

If the bug had been in some proprietary SSL stack, would we even have heard about it? Would it have been fixed? Who knows. That's the WHOLE POINT...

How would proprietary software have handled this? (4, Insightful)

Todd Knarr (15451) | about 9 months ago | (#46761287)

This doesn't really change it, because think how a proprietary SSL library would've handled this. The vulnerability was found specifically because the source code was available and someone other than the owners went looking for problems. When was the last time you saw the source code for a piece of proprietary software available for anyone to look at? If it's available at all, it's under strict license terms that would've prevented anyone finding this vulnerability from saying anything to anyone about it. And the vendor, not wanting the PR problem that admitting to a problem would cause, would do exactly what they've done with so many other vulnerabilities in the past: sit on it and do nothing about it, to avoid giving anyone a hint that there's a problem. We'd still have been vulnerable, but we wouldn't know about it and wouldn't know we needed to do something to protect ourselves. Is that really more secure?

And if proprietary software is written so well that such vulnerabilities aren't as common, then why is it that the largest number of vulnerabilities are reported in proprietary software? And that despite more people being able to look for vulnerabilities in open-source software. In fact, being a professional software developer and knowing people working in the field, I'm fairly sure the average piece of proprietary software is of worse quality than the average open-source project. It's the inevitable effect of hiring the lowest-cost developers you can find combined with treating the fixing of bugs as a cost and prioritizing adding new features over fixing problems that nobody's complained about yet. And with nobody outside the company ever seeing the code, you're not going to be embarrassed or mocked for just how absolutely horrid that code is. The Daily WTF is based on reality, remember, and from personal experience I can tell you they aren't exaggerating. If anything, like Dilbert they're toning it down until it's semi-believable.

This was positive (4, Interesting)

danheskett (178529) | about 9 months ago | (#46761289)

Heartbleed was positive for the world. The bug was found by code review, twice independently within a few days. It was patched rapidly across a hundred different versions and platforms, and the world is now vastly safer. The system worked exactly as it should.

It is entirely likely that a Heartbleed is out there for a closed platform. Or worse. And it's likely that it is being exploited right now, not only by our own government in the US, but by our foreign rivals for economic and political gain. And what's worse, there is probably defunct code out there, full of Heartbleeds, bleeding exploits into the wild uncontrollably.

The only downside it exposed is that some projects have a lock on what they do. OpenSSL is so good that everyone uses it, and no one is seriously interested in forking it or doing a new implementation.

Re:This was positive (1)

dublin (31215) | about 9 months ago | (#46761557)

So there was a bug in OpenSSL. Big bug, yes, but that's not the reason it was (and still is!) a big problem.

The genesis of the big problem is one of monoculture: not only is OpenSSL the dominant SSL implementation, but, probably more importantly, pretty much all Internet security that is accessible and matters to ordinary users is SSL/TLS in the first place.

If you think this is bad, imagine what happens if the fundamentals of SSL itself are compromised: What would we replace it with? How, considering this is effectively the only secure connection technology available across all common OSes and embedded devices? How long would that take? (Years, at least, I'd wager...)

What we need is more flexible security methods in the first place, and open, standard implementations (like OpenSSL, but growable) that can let us proactively extend security methods as the net matures, and *quickly* address bug-based vulnerabilities when that approach fails. (Note that this may require some kind of standard "security code VM", so new code and new methods can be easily distributed even to older systems that may no longer be fully supported. And no, I'm not glossing over things like limits on code space, memory, and the like; nothing will allow every system to be upgraded, but we do need some way to allow and authenticate upgrades while preventing bad guys, including governments, from using the mechanism to create weaknesses.)

The bug was found because it was open source.. (4, Informative)

Black Copter Control (464012) | about 9 months ago | (#46761627)

Nobody was seriously interested in forking it... but the OpenBSD people have now gotten their claws into it, and chances are it's going to be fixed bigtime... or else!

The problem was found because the code was Open Source. If it had been closed source, the bug would still be secret. To the extent that the bug was recognized (or commissioned) and exploited by the likes of the NSA, it would probably have remained secret for a lot longer.

According to Microsoft's EULA, for example, finding -- much less fixing -- such a bug is illegal. If the NSA had paid them to put such a bug into the Windows version of SSL, it would probably remain unpatched for years after someone had pointed it out to them as an exploitable bug... and anybody openly reporting such a bug, even after 6 months of trying to get MS to fix it, would be roundly criticized for disclosing it 'prematurely'. Even then, it would probably not be fixed by Microsoft until at least the next monthly patch cycle (or even the one after that).

With the code being Open Source, the problem got fixed faster than yesterday. Period. If the OpenSSL people had refused to fix it, it would have been forked... and more to the point: such a security-centric fork would have been legal.

... and that is the power and freedom of Free and Open Source software.

"many eyes" (0)

Anonymous Coward | about 9 months ago | (#46761293)

Only true when the developer(s) can own up to the problem. Last time I tried reporting a problem, it took 18 months to get a fix. A majority of that time was spent proving that there was indeed a bug, and it took another developer confirming its existence before the issue was promptly reopened and fixed.

That's how you end up with so many duplicates on bug trackers: reports get closed while other users keep running into the same problems.

SChannel (2)

infernalC (51228) | about 9 months ago | (#46761339)

Most of the non-OpenSSL TLS deployments out there are probably SChannel.

I would be shocked if Microsoft hadn't had equally severe bugs, and further surprised if they could fix them as fast.

Also (3, Informative)

danheskett (178529) | about 9 months ago | (#46761341)

I would like to just point out that this is a huge win in my book for Debian. Those of us running an all-Debian oldstable environment, getting backported security patches, and sticking with the tried and true version of OpenSSL instead of that newfangled 1.0 code release got to write nice letters to our customers saying we still don't use Windows and we were never vulnerable.


Well written code (0)

Anonymous Coward | about 9 months ago | (#46761357)

Just because it is open source, it does not make it well written. If a project does not have enough resources, it will suffer. If you try to write it yourself, you will introduce bugs. Existing code isn't better, but it has been tested longer.

This is a project that is supported by mostly one person. They wrote their own memory management when alternatives exist that have been tested for these types of mistakes (it was done for legacy support and performance).

I don't want to say that this code was poorly written. It was more of a general statement. I did look at the code for this and I saw where the fix was refactored a bit. Wrapping magic numbers into a variable did make it easier for me to read. I spent a long time looking at what was wrong with the original code. I knew where it was wrong, but could not follow it back to see where it went wrong. I decided to just trust that it was wrong and move on.

I think anyone could have made this mistake and I am sure others are out there. I think we got lucky. This could have gone the other way to be the worst worm we have ever seen.

Quick disclaimer: I learned C++ once and can kind of still read it. Some code is obviously easier than other for a novice like me. I have mad respect for devs that work at this level. I mean no disrespect with these comments.

Two things to note (1)

goathumper (1284632) | about 9 months ago | (#46761365)

"Given enough eyeballs, all bugs are shallow" has proven true time and again. The key point in the phrase is "enough eyeballs". In this particular case, the affected software was OpenSSL. Let's examine that for a second.

OpenSSL is a cryptography library. Cryptography is, by its nature, a very "exclusive" field of development, due to the complex mathematics and rigorous rules that have to be followed in order to contribute successfully. It follows that the audience that is both capable and willing to contribute to the project is very, very small relative to the audiences readily available to other projects such as Apache Tomcat or GNOME.

This is where the "enough eyeballs" comes into play: clearly, for the longest time, there weren't enough. The reason is understandable and explained in the above paragraph - the vast majority of software developers out there are probably not able to contribute meaningfully to a project such as OpenSSL.

However, echoing other comments that have already been posted, the good news is that because it was open source, the vulnerability was detected and corrected. Had it been closed source, it might never have been found -- let alone acknowledged or fixed. I'll take that over a walled garden any day of the week and twice on Sunday. That -- to me, at least -- reinforces the argument that open source is safer and more secure than closed source, not the other way around as some would like to believe, by the simple fact that a larger number of eyeballs can be brought to bear on a piece of software to eventually shallow out the bugs.

How many closed-source companies are willing to make that level of investment in their software quality if they can still be profitable without having to do it? Further still, what if making that investment would bring profitability into question? Would they still make the investment? I think not...

It is still safer (0)

Anonymous Coward | about 9 months ago | (#46761377)

Safer means less chance of an issue. But just like in gambling, as the odds get longer, the jackpot goes up.
We have now seen someone win 100M, something that happens every 40 years.


it IS safer (2)

roman_mir (125474) | about 9 months ago | (#46761383)

What if this was not 'OpenSSL' but instead it was some form of 'ClosedSSL' library that had this problem in it?

The NSA would still have access to THAT code, you can bet your ass they would; they wouldn't leave a project like that alone. However, nobody else would know (unless they stumbled upon it by chance or could access the source, OR some insider SOLD that information to somebody on the outside -- and now you'd have a vulnerability exploited by the gov't and by the shadiest of organisations/people out there).

This does not change the discussion in terms of open source code being safer. It changes the discussion around certain practices of development/testing, and it may also draw more attention to the SECURITY of our information on the Internet; hopefully we'll move in the direction of working out much more SECURE methods of communication.

I certainly have a few ideas of my own that I would like to implement now, but never mind that. The point is that this is good stuff; it finally shed light on a topic that should have had much more light on it long ago.

We need better methods around building security within our systems and I think this raises the bar.

Eric Raymond Delusional Asshat (0)

Anonymous Coward | about 9 months ago | (#46761441)

Eric Raymond famously said, 'given enough eyeballs, all bugs are shallow.'

In the delusional make-believe world of open source, yes. In the real world, no. In the real world, the only people spending any meaningful amount of time scrutinizing the source code of ANY project are the few people actually working on it.

Beyond once again disproving the delusional 'all bugs are shallow' bullshit, the real problem with OpenSSL was a lack of proper testing -- another problem that plagues open source projects, because proper testing isn't fun.

Eric Raymond is more or less right (0)

Anonymous Coward | about 9 months ago | (#46761451)

When a bug makes itself obvious to many users, then many eyeballs do get applied, find it, and come up with a fix quickly. Heartbleed was not obvious to anybody except its perpetrators. Also, the implicit assumption behind the general case of Raymond's assertion is that many users means many programmers who can understand the code and will bother to look at it before they use it. This is demonstrably false to even the most casual observer.

The problem with code that provides "infrastructure", like operating system kernels or network stack software, is that when it runs smoothly, almost nobody outside its small group of developers bothers looking at the source. Heartbleed ran quite smoothly as it sent off those passwords, so nobody looked.

Better != Perfect (0)

DrJimbo (594231) | about 9 months ago | (#46761525)

Next question.

bugs are not the issue (1)

csumpi (2258986) | about 9 months ago | (#46761543)

Bugs are not the issue; it's how systems get updated once the bugs are fixed. Without automatic security updates, Heartbleed will be with us for a long, long time.

It doesn't. (1)

Anonymous Coward | about 9 months ago | (#46761591)

This alters the "Open source is safer" discussion in the same way that someone dying from an allergic reaction to a vaccine would alter the "Getting vaccinated is safer" discussion.

Windows source code is open source to US government (0)

Anonymous Coward | about 9 months ago | (#46761641)

Let's not forget that the NSA has Windows source code and can find and exploit bugs without reporting them to Microsoft.

The Heartbleed bug is mostly useful only to the NSA, because once you extract the server's private key you still need to be able to eavesdrop on the target's communications. Most people don't have the ability to do that.

Better documentation of source code is needed (2)

Eravnrekaree (467752) | about 9 months ago | (#46761719)

I do believe open source is safer, as it absolutely allows for independent third-party review, which is how this bug was found. Because outside parties had access to OpenSSL, they were able to find the problem, whereas with closed source software it might never have been found, or found but hushed up by the company. Proprietary software has just as many bugs as open source, if not more; the difference is that there is less accountability.

That being said, the full potential of open source for independent review is not realized, because a lot of open source software is poorly documented as to the internal construction of the code. This wastes programmers' time, forcing them to spend more of it than they should learning the internals, and it even wastes the time of those running the project, who end up repeating explanations of the code that some documentation could have answered without bothering the project leads. It makes the learning curve much steeper when software has a lot of code but no documentation on how that code fits together. On one hand, we say that open source allows people to review the code, but opening the source alone does not make that as easy as possible; the code needs internals documentation, or it will often take outsiders simply too much time to penetrate it. Many open source projects end up with a clique who understand the internals because they wrote them, while outsiders find the code difficult to penetrate. Even for an expert programmer, having documentation speeds up becoming familiar with the code immensely.

Not documenting code is a poor practice, and open source developers should document what they are doing, both for others and to save themselves from explaining things over and over again to newcomers.

Oh well (1)

mx_mx_mx (1625481) | about 9 months ago | (#46761731)

How 'bout that SSL bug discovered not long ago in OS X/iOS?

Sure, open source is not a silver bullet; nobody argues that.
And the law of large numbers sure does its thing; yes, shit happens in open source as well.
But how often does this happen in open source vs. closed source?
How often do such incidents go unnoticed in the closed source world?
