May 5, 2006 5:47 PM PDT

Linux kernel 'getting buggier,' leader says

Andrew Morton, the lead maintainer of the Linux production kernel, is worried that an increasing number of defects are appearing in the 2.6 version and is considering drastic action to resolve the problem.

"I believe the 2.6 kernel is slowly getting buggier. It seems we're adding bugs at a higher rate than we're fixing them," Morton said in a talk at the LinuxTag conference in Wiesbaden, Germany, on Friday.

Morton said he hasn't yet proved this statistically, but has noticed that he is getting more e-mails with bug reports. If he is able to confirm the increasing defect rate, he may temporarily halt the kernel development process to spend time resolving issues.

"A little action item I've given myself is to confirm that this increasing defect rate is really happening," he said. "If it is, we need to do something about it."

"Kernel developers will need to reapportion their time and spend more time fixing bugs," he added. "We may possibly have a bug fix-only kernel cycle, which is purely for fixing up long-standing bugs."

One problem is that few developers are motivated to work on defects, Morton said. This is particularly a problem for bugs that affect old computers or peripherals, as kernel developers working for corporations don't tend to care about out-of-date hardware, he said.

Nowadays, many kernel developers are employed by IT companies, such as hardware manufacturers. That can cause problems, as they may be motivated by self-interest, Morton suggested.

"If you're a company that employs a kernel maintainer, you don't have an interest in working on a 5-year-old peripheral that no one is selling any more. I can understand that, but it is a problem, as people are still using that hardware. The presence of that bug affects the whole kernel process, and can hold up the kernel, as there are bugs, but no one is fixing them," he said.

Differences in a kernel
During his talk, Morton discussed the 2.6 kernel development process. He explained that if people want to get their code into the kernel they should send it to him, and not to Linus Torvalds, who maintains the development kernel. Morton manages the "-mm" code branch, which is where patches are tested before being added to the development kernel.

"The way an individual can get their code into the kernel is by sending it to me. I will buffer it in my (mm) tree and send it to Linus," he said.

"It's fairly rare for a person to send a patch to Linus and get it in. In fact, Linus is fairly random at patches at the best of times. Generally, Linus will cc: it to me because he knows I'll pick it up," Morton added.

"The mm tree is what Linus' tree is going to look like in three months time. A lot of stupid bugs get in. I wish people would send me code that compiles--probably about 75 percent do," he said. "Without mm, all of these problems wouldn't be discovered until they hit the mainline tree, and would impact everyone's ongoing development."

The LinuxTag conference goes on until Saturday. Talks that take place in the main conference room can be watched online via a free Webcast (instructions in German).

Ingrid Marson reported for London-based ZDNet UK.

Why No Stats? That's scary
I don't know of a single well-run project that isn't running a good bug-tracking system that can provide data on request. Dump it into a spreadsheet and chart it.

All software has bugs, but the fact that a team lead doesn't have a reporting process in place should worry the Linux community to no end.
Posted by fogfire (21 comments )
Worry for everyone
Not just the Linux community -- this is terrible news for those trying to raise the profile of Linux in the enterprise.

It flies in the face of all the hoohaw about open source projects being more responsive to bugs than closed source projects.
Posted by Betty Roper (121 comments )
He has stats...
Keep in mind that the Linux kernel maintainers do have the stats you'd expect for such a project. What Morton is getting at is that there's an increasing number of modules in the code base that support legacy hardware and currently have no reported bugs, but for which he suspects bugs probably exist, because other kernel code has changed since those modules were last updated. It's a problem because few people have the hardware, so errors are rare, reports rarer still, and developers have little interest in reviewing that code without a bug report.

Keep in mind that the core kernel is no more or less buggy than it has been before (and, according to several companies that use code-auditing software to examine the Linux kernel, it is of remarkably high quality -- better than commercial code).

Andrew is going to need a team of people to go back, look at things like MFM support, and run regression tests on it. There are no bug reports, but it's wishful thinking to assume nothing about it broke in the evolution of the 2.6 kernel. After sampling a few bits of the older code, he ought to get a feel for how many bugs have arisen without reports of problems.
Posted by Zymurgist (397 comments )
It's rather disingenuous
On the one hand we have the open-source model, which by definition is open: all concerns, gripes, and comments are in public, where everyone can see them and make their own judgments. On the other we have a proprietary model that we cannot even compare, as there is no public data (legally) to verify or compare it to. Common sense and a little critical thinking should lead one to the conclusion that a closed system will always be inferior to one where public scrutiny can be applied to make judgments and further improvements.
Posted by Johnny Mnemonic (374 comments )
What sense is that?
On what "common sense" do you base such reasoning? Thus far the closed systems have worked better. Linux cannot do the things I need to do; Windows can; therefore Linux is inferior in that context. I do not know of any context related to my work in which Linux would be superior to Windows in any way. Beta testers for Windows can document and report bugs and performance issues at any time, just like with any closed-source software. You do not need the code out in the open to diagnose the problem; that's what the engineers are for. Simply flag a feature or action, and they will deal with it. I would never want Windows to be open source.
Posted by PurePacket (28 comments )
That's not common sense.
Common sense and critical thinking lead to the conclusion that testing is the only way to ensure that a system satisfies business requirements - whether that system is open or closed is irrelevant.

The article is not even about open vs. closed systems. It is about Linux. You seem to be making the argument that Linux is better than the alternatives, so the problems with open source development should be ignored. There's just no sense in that.
Posted by just_some_guy (231 comments )
That's ridiculous
How can Johnny Mnemonic reach such an absurd conclusion? A system where anyone can submit code with no responsibility for its integrity will always be inferior.
Posted by DavidWorkman (14 comments )
I don't follow you
Why does that prove a closed system is always inferior? If that were the case, the worst possible open source system (even an operating system developed in five hours) would be better than the best closed source OS. And that's obviously not true. So your reasoning doesn't hold.
An open-source OS will only be the better one if it is well developed, well maintained and well tested. And the problem with OSS today is that it is far more fun to write a new piece of code to solve an interesting problem, or to implement cool new functionality, than it is to systematically re-read all the stuff that was written by someone else to make sure it is bug-free, test it in every way conceivable, and correct all the bugs found. Given that, if I were as prone to generalize as you seem to be, I would say that commercial OSes will always be better than open-source ones, since corporations pay people to do that boring work (and pay them to do their best at it). But that would be an unfair and incorrect generalization, as yours is.
Posted by Hernys (744 comments )
What do you expect when you don't pay people anything!
There is an old rule that "you get what you pay for."
Of course there are exceptions to any rule, and perhaps Linux & Apache are among the best exceptions, but the rule still applies.

So if you have people working for free, and anonymously too, what do you expect? That they are going to do a good & timely job?
Come on!
After all, ask yourself: how many of us would like to work for free?
I think that was called slavery!
Now of course some rare ones among us will do volunteer work for free, but that is certainly going to be limited on many levels.

What I am saying is that the GNU/BSD open-source business model is flawed on so many levels; only on the rarest occasions does it defy the rule and create a success (the exceptions being Linux, Apache and a few other products), and even those products are in jeopardy from people "working for free"!
Posted by caudio_roma (57 comments )
Not an old rule.
"You get what you pay for" is a maxim, not a rule, and were it true you'd suffocate (unless you're buying the air you're breathing).

The fact that you can receive the code for free has no bearing on its quality (which, to date, still measures higher than that of its competitors).

First, not all people working on the kernel do so for free. There are professional contributors specifically tasked to contribute (people paid by Sun, IBM, HP, etc. specifically to work on the Linux kernel), those who contribute to very specific projects (drivers by hardware vendors, toolkit authors, application developers), government agencies (like the NSA), plus students, professors, and engineers who all get paid to contribute. I've contributed to a number of such projects, and have been supported by my employer to do so in the course of my work as a computational biologist.

Even if you are coding for free, it wouldn't be slavery unless it weren't voluntary. Boston held a 20-mile "Walk for Hunger" this weekend, and I'm guessing the walkers didn't consider themselves slaves. Nor does Habitat for Humanity -- at least I don't think so.

Those who contribute are generally experts in their field looking for a platform that is best-in-breed for their work, or they are drawn quite specifically to a particular part of the project. In the physical sciences we tend to use Linux for a number of reasons: it is easier to tune to applications, has better APIs with more succinct and correct documentation, better performance, and far greater stability. To that end, we contribute whatever we can that furthers Linux's (and various APIs') superior traits in this regard, for the benefit of our company and our peers in the physical sciences. It may seem funny, but scientists and engineers still share information and tools with each other pretty freely (even in corporations). Being PhDs in computational science, perhaps our demands are a little higher, but it's not as if there are tech companies willing to be open with their technologies or to offer them without license restrictions and such.

Open source is only flawed in the respect that it makes a poor proprietary software model. It is the model that existed before proprietary software companies, and it looks as though it's the model performing best right now. Sure, it's possible to make a buck off it ( http://finance.yahoo.com/q/bc?t=5y&s=RHAT&l=on&z=m&q=l&c=MSFT ), but even if you don't, the software by definition will exist and flourish until interest dies; a for-profit company's products will survive only as long as they remain profitable, or until something else (perhaps not better) replaces them.
Posted by Zymurgist (397 comments )
Ignorance
As was stated, many of the developers do get paid, and paid well. You think Linus has made nothing from this? Your 'working for free' argument is pointless, as it is not true.

Second, even when contributors make nothing, as is sometimes the case, the number of flaws found is far smaller than at a certain proprietary company (*cough* MS), and they get fixed faster. Compare two to three days at most with months, if ever.

Your slavery comment just underscores that you have no idea about open source, so why are you even commenting?
Posted by Bill Dautrive (1179 comments )
I've Seen Windoze Source, And It Ain't Pretty ...
in fact, the word "fugly" comes to mind. A government agency I worked at had a license for Windoze source so that security flaws could be identified and solved without having to wait for Microsloth to get around to fixing them, if ever (it was a condition of a contract Microsloth desperately wanted that yielded big bucks for them). The vast majority of the code in Windoze is written by their least-experienced programmers who are just out of school (I won't even insult real software engineers by calling Microsloth's programmers anything like an engineer of any kind - even a "sanitational engineer", aka a garbage collector - it's pretty obvious they've never even heard of Murphy's Law, the First Commandment for Real Engineers). For most of them, Windoze is the first real product they've ever worked on, and only have their theoretical background from which to work. Another example of how the junior people get stuck with important work that no one else wants to do is that, if you break a build, you become the build-master for that build until someone else screws up, so they wind up with even more work to do in the same amount of time, which results in even more functional and operational bugs (but they don't break the build!).

I would much rather have code for which the source is available than not for many reasons, not the least of which is that, at least I can easily trace the code to see exactly what is going on, and either fix it myself, or pay someone to fix it. The vast majority of developers doing substantial work on Linux today are being well paid to do so full-time, either by one of the companies that provide fully-supported releases, like Red Hat, IBM, etc., or internal development teams in companies where Linux is used as the core of their product (TiVo may be the largest deployer of an embedded Linux, now, with over three million boxes Out There). Such companies also have substantial dedicated QA/QC/testing groups that do everything from regression tests to ensure proper original functionality by changed/augmented code, to black-box tests on new code.

It's quite natural for projects that grow beyond a critical mass to need to undergo some significant testing, quality control, clean-up, etc. At least open-source customers can fairly readily perform whatever level and type of testing they want/need, they can also prioritize and address the problems as they see fit, and not be limited to whatever schedule developers of proprietary products decide should be done to satisfy the lowest common denominator of all their customers.

As for the track record of proprietary products vs. open-source, the legion of unresolved bugs, vulnerabilities, etc., associated with the former is a sad testament to their lack of quality (and the bugs that are made public are just those we know about). There are very likely many more bugs that history has demonstrated companies don't want to publicize because of embarrassment, having to do PR damage control, and just plain reducing their profits by having to spend money doing what they should have done in the first place (it's amazing how everyone has time to fix bugs, but very few make the effort to prevent them, in the first place). I don't trust anyone to do what's in my best interests, especially when they're taking money from lots of other customers.

At least now the potential problem with bugs in the Linux kernel has been raised, and you can be sure that it will remain well-publicized, by well-heeled competitors, if not the media and customers/users, until the issues are resolved, one way or another.

All the Best,
Joe Blow
Posted by Joe Blow (175 comments )
2 cents
Assuming that someone who cannot even spell "Microsoft" or "Windows" is qualified to judge the quality of their source code...

1) The article has nothing to do with Windows. Andrew Morton says that the Linux code has problems, and he IS qualified to judge it. The quality of Windows code is irrelevant. It would be counter-productive to sweep the issue under the carpet.

2) The majority of software users are not software developers. These users are not capable of fixing the code themselves, and have no interest in viewing it.

3) The article is solely about the open source development model, not about open source vs. proprietary code. There are problems with the open source development model, and Andrew Morton suggests that the problems be addressed. He is correct. There are likely many open source developers that share your view ("at least its better than Windoze!!!"), which negatively impacts the code. Since anyone can contribute to open source, the challenge is to weed out such riff-raff. The quality of code should not be measured against the competition, but against measurable and testable criteria.
Posted by just_some_guy (231 comments )
Link Flag
I've Seen Windoze Source, And It Ain't Pretty ...
in fact, the word "fugly" comes to mind. A government agency I worked at had a license for Windoze source so that security flaws could be identified and solved without having to wait for Microsloth to get around to fixing them, if ever (it was a condition of a contract Microsloth desperately wanted that yielded big bucks for them). The vast majority of the code in Windoze is written by their least-experienced programmers who are just out of school (I won't even insult real software engineers by calling Microsloth's programmers anything like an engineer of any kind - even a "sanitational engineer", aka a garbage collector - it's pretty obvious they've never even heard of Murphy's Law, the First Commandment for Real Engineers). For most of them, Windoze is the first real product they've ever worked on, and only have their theoretical background from which to work. Another example of how the junior people get stuck with important work that no one else wants to do is that, if you break a build, you become the build-master for that build until someone else screws up, so they wind up with even more work to do in the same amount of time, which results in even more functional and operational bugs (but they don't break the build!).

I would much rather have code for which the source is available than not for many reasons, not the least of which is that, at least I can easily trace the code to see exactly what is going on, and either fix it myself, or pay someone to fix it. The vast majority of developers doing substantial work on Linux today are being well paid to do so full-time, either by one of the companies that provide fully-supported releases, like Red Hat, IBM, etc., or internal development teams in companies where Linux is used as the core of their product (TiVo may be the largest deployer of an embedded Linux, now, with over three million boxes Out There). Such companies also have substantial dedicated QA/QC/testing groups that do everything from regression tests to ensure proper original functionality by changed/augmented code, to black-box tests on new code.

It's quite natural for projects that grow beyond a critical mass to need to undergo some significant testing, quality control, clean-up, etc. At least open-source customers can fairly readily perform whatever level and type of testing they want/need, they can also prioritize and address the problems as they see fit, and not be limited to whatever schedule developers of proprietary products decide should be done to satisfy the lowest common denominator of all their customers.

As for the track record of proprietary products vs. open-source, the legion of unresolved bugs, vulnerabilities, etc., associated with the former is a sad testament to their lack of quality (and the bugs that are made public are just those we know about). There are very likely many more bugs that history has demonstrated companies don't want to publicize because of embarrassment, having to do PR damage control, and just plain reducing their profits by having to spend money doing what they should have done in the first place (it's amazing how everyone has time to fix bugs, but very few make the effort to prevent them in the first place). I don't trust anyone to do what's in my best interests, especially when they're taking money from lots of other customers.

At least now the potential problem with bugs in the Linux kernel has been raised, and you can be sure that it will remain well-publicized, by well-heeled competitors, if not the media and customers/users, until the issues are resolved, one way or another.

All the Best,
Joe Blow
Posted by Joe Blow (175 comments )
Reply Link Flag
2 cents
Assuming that someone who cannot even spell "Microsoft" or "Windows" is qualified to judge the quality of their source code...

1) The article has nothing to do with Windows. Andrew Morton says that the Linux code has problems, and he IS qualified to judge it. The quality of Windows code is irrelevant. It would be counter-productive to sweep the issue under the carpet.

2) The majority of software users are not software developers. These users are not capable of fixing the code themselves, and have no interest in viewing it.

3) The article is solely about the open source development model, not about open source vs. proprietary code. There are problems with the open source development model, and Andrew Morton suggests that the problems be addressed. He is correct. There are likely many open source developers that share your view ("at least its better than Windoze!!!"), which negatively impacts the code. Since anyone can contribute to open source, the challenge is to weed out such riff-raff. The quality of code should not be measured against the competition, but against measurable and testable criteria.
Posted by just_some_guy (231 comments )
Link Flag
Strange how Linux has so many more bugs, then
If that's true, then why is it that Linux has so many more bugs and security vulnerabilities than Windows does?
Posted by richto (895 comments )
Reply Link Flag
You mean less...
By all available measures, Linux has fewer bugs and security vulnerabilities than Windows. Microsoft used to spin vulnerability reports to make it look otherwise (for example, SANS has a habit of issuing a separate bug report for each Linux distribution but generally only one for Windows, meaning you get the exact same issue reported up to 20 times for Linux -- and SANS will count many Linux apps as "Linux" whereas Windows apps are considered distinct).

Presumably, Linux ought to have many more critical bug reports because it's transparent (everyone can see how it works), and it's a particularly desirable target for hackers (the most widely deployed platform for online commerce and global telecom -- compromising Linux servers would produce a big payoff for fraud).

The fact that it doesn't speaks to good core design.
Posted by Zymurgist (397 comments )
Link Flag
Just a thought.
Maybe it would have fewer bugs if it had fewer lines of code. Maybe it would have fewer lines of code if they didn't try to integrate every driver available into the kernel. Maybe all software would be better if it were cost effective to write it better.
Posted by System Tyrant (1453 comments )
Reply Link Flag
It's not lines of code or drivers.
Keep in mind that Windows XP is about 45 million lines of code, and the current Linux kernel about 1/8th that.

It is also true that Linux does have drivers for more hardware than XP out of the box, but it's a little off-base to say that they are integrated into the kernel. More than 90% of the drivers are modularized in Linux and are typically compiled and loaded as such. In that way, they aren't much different than Windows drivers.

It's not the number of lines that count, but the number of bugs. Linux has a very low defect rate compared to average for commercial proprietary code (according to the vendors of software that measures these things), and a much smaller code base. Also, writing driver modules for Linux is far simpler than for Windows (which is why it's popular in the development of hardware).

I'm sure software would be better if it were cost effective to write it better. That's why environments that distribute the cost among multiple interested parties (like OSS models do) typically fare better than singular parties (like a typical corporate scenario).

Drivers are a good example. They are hugely expensive to produce. Many vendors never release specifications on their products and bear the sole responsibility for writing a driver. Since they make money selling hardware, the driver needs only to be good enough to get the product sold -- so, typically, there's not much development on them and the drivers aren't optimal for the hardware. Moreover they support a narrow range of operating systems and for a limited period of time (hardware manufacturers often don't have the resources to support multiple revisions of a single OS, much less multiple OSs). So, much hardware support is reverse-engineered, incurring further costs. Vendors could publish specs on their hardware and support more systems, while learning from various implementations of support on those systems, producing better drivers for more diverse platforms at tremendously reduced cost. Of course, many hardware vendors have also found themselves in the position where their products are now violating various "intellectual property" laws, so secrecy may well trump software development cost savings.
Posted by Zymurgist (397 comments )
Link Flag
It's Good to Know Someone's Watching
I mean... no one wants a Wikipedia-style goofed entry in Linux... Come on...
Posted by Mendz (519 comments )
Reply Link Flag