Author: T.J. Duchene
Date:
To: Steve Litt, dng
Subject: [Dng] John Goerzen asks, "Has modern Linux lost its way?"
Hey, Steve! Sorry about the "tome," but it is an interesting and long-winded subject.
“I agree that use of C has lessened, and that less of us understand C
today. I disagree that this is necessarily the cause of the problem.”
I'm afraid that is where we differ, Steve, at least in the overall picture of
all things Linux. It's been my experience that very few people compile their
own Linux. Without that experience, most people have turned to other languages as their primary language. From what I have seen, a lot of younger programmers
prefer the languages that you mention: Python, Perl, Lua, Ruby, etc.
When something new and critical written in C gets shoved into userspace, all hell breaks loose because a vocal majority do not understand how to “hack
it”. It's not part of the “black box” of the kernel. What I am saying is that
one of the largest sources of discontent that I hear from more and more Linux quarters is the fact that you have to use a C compiler at all. The fact that you need to recompile something in order to change its behavior seems especially exasperating to them. IMHO, Linux users have gotten too lazy and
expect every package to be what they want, just as long as they don't have to
do it themselves. It's a trait they share with Windows users.
“Personally, I think that anything that *can* be written in Python, Perl, Ruby
or Lua *should* be written in one of those languages, because doing so limits
buffer overruns to those remaining in the language. Also, software written in
those, especially Python, which has a rich library of standard libraries, has
fewer dependencies that need to be installed. And yes, feature for feature,
they're more readable than C.”
With respect, I consider those arguments to be ungrounded.
Depending on a run-time to catch your buffer overflows is an inanely stupid thing to do. Expecting the run-time to be responsible for catching overflows means that you routinely aren't going to be checking your own bounds. The
design of the language might even be such that it prevents you from performing
checks you otherwise could or should. An example of this would be C#.
Microsoft does not always document which resources are marked as disposable or
not, leaving you to guess or test every single object that you instantiate or
inherit. Probably the most insidious problem with run-times of this nature is that if there is a flaw in the run-time, you will have that flaw in your code, and it is hard as hell to track down because you aren't bothering with your own checks. With open source, it's survivable: just crawl in and fix it. If it isn't open source, you're left high and dry until a patch is issued. This is not to
mention the additional run-time complexity or the fact that it generates
significant processor overhead that can slow things down or diminish battery
life.
Personally, I'd just rather do it myself. I've never found run-time checking to be a feature that I can depend on, nor have I found the languages that force it on you to be dependable either.
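For what it's worth, here is a minimal sketch of what I mean by checking your own bounds in C. It's a throwaway illustration of my own (the function name and the buffer size are made up for the example): the caller states how big the destination is, and the copy is refused when it won't fit, instead of hoping a run-time notices the overrun after the fact.

#include <stdio.h>
#include <string.h>

/* Copy src into a fixed-size buffer, refusing anything that will not fit,
   rather than trusting a run-time to notice the overrun afterwards. */
static int copy_checked(char *dst, size_t dstlen, const char *src)
{
    size_t need = strlen(src) + 1;   /* include the terminating NUL */
    if (need > dstlen)
        return -1;                   /* too big: let the caller decide */
    memcpy(dst, src, need);
    return 0;
}

int main(void)
{
    char buf[16];
    if (copy_checked(buf, sizeof buf, "this string is far too long for buf") != 0)
        fprintf(stderr, "input too long, refusing to copy\n");
    return 0;
}

Nothing clever about it, but the check is mine, it is visible at the call site, and it does not depend on anybody else's run-time being correct.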
I'm sorry, but I don't agree with the idea of Python and other such languages having few dependencies due to a “rich standard library” either. All of them
use C bindings for things that they cannot do for themselves. This means that
although you are using a higher level language and supposedly removing the
possibility of C programming errors, you are still depending on C, and simply
adding another layer of complexity with more potential for error on top of
what you are already using. This is not to mention that you are now adding whatever run-time the language requires to your typically already bloated binary Linux distribution as a system dependency. Systemd is another example of that, in my opinion: you must use tools written in Python, which forces you to further subsidize Python, unless you rewrite them. Another much-hated example would be anything to do with PHP. God forbid you ever need to use PHP for any reason. It's a disaster of
dependencies waiting to happen.
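To illustrate the C underneath the “rich standard library”: the sketch below is roughly what a minimal CPython extension module looks like. It is a toy of my own invention, not code from any real module, and the names toy_example and add are made up. A great deal of Python's standard library is exactly this kind of C glue, so any flaw in that layer is a flaw in every “pure Python” program that calls it.

/* toy_example.c - a minimal CPython extension module.
   Build it as a shared object and Python can "import toy_example". */
#include <Python.h>

static PyObject *toy_add(PyObject *self, PyObject *args)
{
    long a, b;
    /* Parse two Python integers; a bad argument raises TypeError. */
    if (!PyArg_ParseTuple(args, "ll", &a, &b))
        return NULL;
    return PyLong_FromLong(a + b);
}

static PyMethodDef toy_methods[] = {
    {"add", toy_add, METH_VARARGS, "Add two integers in C."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef toy_module = {
    PyModuleDef_HEAD_INIT, "toy_example", NULL, -1, toy_methods
};

PyMODINIT_FUNC PyInit_toy_example(void)
{
    return PyModule_Create(&toy_module);
}

The higher-level language doesn't remove the C; it just hides it behind another layer you now depend on.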
I am a fan of keeping language dependencies to an absolute minimum, which is why I like C.
As for them being more readable, I can't agree with that either. “Readable
code” resides in actually being able to understand the conventions of the language. The argument that one language is more readable than another is complete and utter nonsense. I've seen Perl code that I found difficult to decipher, while assembly might be completely comprehensible. It's entirely
a subjective idea and not an objective assessment.