Re: [Dng] garbage collection
Author: T.J. Duchene
Date:
To: dng
Subject: Re: [Dng] garbage collection

Hi Henrick!

> There are times when garbage collection is appropriate, and times when it
> is not.
> Most of the time it doesn't hurt, and is a great help for ensuring
> correctness.

An argument can be made for that, it is true. However, there are
arguments against it which are, in my mind at least, far more serious and
compelling.

1) GC rewards and encourages bad practices.

It encourages programmers to rely on the runtime for cleanup. It makes
training programmers in proper memory management problematic when they move
to languages that are not used at the application level, e.g. assembly, C,
or C++. It trains them in a paradigm that is neither constant nor 100%
reliable. One of the first things I learned in the first computer
science course I took, years ago, is that you cannot depend on the
computer to watch itself. You must take the necessary steps to ensure
robust correctness. Designing languages that attempt to do that for you
creates object code that is inherently unreliable.
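
To make the habit concrete, here is a minimal Java sketch (the class,
method and file names are all made up for illustration). The first method
leans on the collector to reclaim a file handle at some unknown future
time; the second releases it deterministically, the way you would have to
in C or C++:

import java.io.FileInputStream;
import java.io.IOException;

public class CleanupHabits {

    // Bad habit: the stream is never closed. The OS file descriptor
    // is only reclaimed if and when the GC gets around to the object
    // - which may be arbitrarily late, or never.
    static int readFirstByteLazy(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        return in.read();
    }

    // Explicit cleanup: try-with-resources closes the stream the
    // moment the block exits, GC or no GC.
    static int readFirstByteStrict(String path) throws IOException {
        try (FileInputStream in = new FileInputStream(path)) {
            return in.read();
        }
    }

    public static void main(String[] args) throws IOException {
        // "example.txt" is a placeholder path.
        System.out.println(readFirstByteStrict("example.txt"));
    }
}

Both methods compile and "work", which is exactly the problem: the
language gives the programmer no reason to prefer the second.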


2) Major memory flaws in the GC.

If there are flaws in the GC, code that would otherwise be reliable leaks -
eventually causing the system to crash. You can argue that any flaw in the
runtime causes problems, and I agree. The fact that GC has become a mandatory
feature of languages, especially proprietary ones, increases this problem
significantly. The fact that you often cannot fix a proprietary runtime
yourself makes matters far worse. You are literally unable to fix the
problem on any level, and you are held hostage, with no way to write code
that can at least mitigate it.

3) Resources

Just having GC, even without using it, increases the resource requirements
of the runtime and diminishes system efficiency significantly. This is a
fact that no one debates: GC creates significant overhead. By having GC,
you have to depend on it to release resources so that they can be reused,
and you might have to wait. It's called "late deallocation". You never know
precisely when resources are released to the OS, which makes planning to
use the absolute minimum of resources impossible. They are never released
immediately, and that can lead to system exhaustion scenarios that you
cannot predict.
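
Here is a minimal Java sketch of that exhaustion scenario (assuming a
Unix-like system where /etc/hostname exists and the per-process descriptor
limit is modest). Each iteration opens a stream and immediately drops the
reference, so the descriptor only goes back to the OS when the collector
eventually gets to the object:

import java.io.FileInputStream;
import java.io.FileNotFoundException;

public class LateDeallocation {
    public static void main(String[] args) throws FileNotFoundException {
        for (int i = 0; i < 1_000_000; i++) {
            // Each stream holds an OS file descriptor. The reference
            // is dropped at once, but the descriptor is returned only
            // when the GC collects the object. If the heap is not
            // under pressure, the GC may not run, and the process can
            // die with "Too many open files".
            new FileInputStream("/etc/hostname");
        }
        System.out.println("Survived - but only because the GC ran in time.");
    }
}

Whether this loop finishes depends entirely on when the collector decides
to run - which is precisely the unpredictability described above.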


4) Problem/Solution Paradox.

GC was created to solve a problem that it literally cannot solve.
Regardless of how well your GC works, you will never prevent memory leaks
all the time. At this point, you would nod and say "yes, but so what - half
a loaf is better than none". So, GC does not solve the problem, and it
doesn't make it worse. Or does it? At least some of the time you are
required to make certain that resources are released manually. If the
documentation is spotty - for example, the majority of open-source projects
have poor documentation - how are you supposed to know which resources
require manual release and which are collected? In most languages, if you
override the collector to make sure, you can get questionable results. All
of this leads back to something bad: unreliable code.
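
Java's finalize() is the textbook case of that overriding going wrong (a
minimal sketch; the class is made up). The runtime promises only that the
finalizer runs before the memory is reclaimed, if it ever runs at all, so
any release logic placed there is unreliable by construction - which is
why the method has been deprecated since Java 9:

public class FragileHandle {

    // Overriding the collector's hook to "make sure" a resource is
    // released. Nothing guarantees this ever executes: the JVM may
    // exit first, and System.gc() is only a hint.
    @Override
    protected void finalize() {
        System.out.println("finalize() ran - eventually.");
    }

    public static void main(String[] args) {
        new FragileHandle();
        System.gc(); // a request, not a command - it may do nothing
        System.out.println("exiting main - the finalizer may never run");
    }
}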

The truth is pretty hard to ignore when non-GC languages have more use
scenarios than GC ones. Not only does GC not solve the problem, it makes
the language more complex, and its real-world usefulness and reliability
are actually diminished. What is the point?


> I think that's the right approach. It simply isn't an all-or-nothing
> matter.

I would respectfully disagree. I think language design would be better
served if, instead of trying to use GC, the compiler flagged any code that
allocates resources but does not release them as an error. The goal is to
make more reliable code, not to muddy the water with code that works most
of the time.