Author: Enrico Weigelt, metux IT consult
To: dng
Subject: Re: [DNG] Politics of IT in the U.S. government
On 05.08.2016 01:18, Steve Litt wrote:
> The kinds of things that eliminate conditionals can also be done by C
> structs, Perl hashes, Python dicts, etc. You could, if you wanted to,
> even pin function references to a struct, hash or dict to avoid OOP.
The interesting - maybe philosophical - question here is whether you're
actually doing OOP anyway, just w/o using OOP language constructs.
We could claim the Linux kernel is heavily OOP'ed (and even the Unix
philosophy of "everything's a file" is, in essence, OOP).
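A rough sketch of what such 'OOP in C' looks like (hypothetical names,
a kernel-style ops struct w/ function pointers pinned to it):

#include <stdio.h>

/* hypothetical "device" interface: behaviour is carried by the struct
 * itself via function pointers, much like the kernel's file_operations */
struct device_ops {
        int  (*open)(const char *name);
        void (*close)(int handle);
};

static int null_open(const char *name)
{
        printf("open %s\n", name);
        return 0;
}

static void null_close(int handle)
{
        printf("close %d\n", handle);
}

static const struct device_ops null_dev = {
        .open  = null_open,
        .close = null_close,
};

int main(void)
{
        /* dispatch goes through the struct - no switch/if cascades */
        int h = null_dev.open("some-device");
        null_dev.close(h);
        return 0;
}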
> I don't usually do that: I use OOP when different functions are
> used for different situations, but you could.
When programming in C(++), I try to avoid C++ constructs to get around
their costs, and in general I try to KISS and think twice before coding
anything down. OTOH, whenever I get in touch w/ C++ projects, they're
usually fat, bloated and hard to maintain.
> I sometimes go OOP, and sometimes not. And some of my programs are
> basically structured programming with some objects thrown in. I think
> one would need to be nuts not to represent a data store as an object.
Why so? PDOs (plain data objects) in particular usually don't need much
more than a struct. The whole idea of objects - in contrast to a struct -
is that they have their own logic and do things on their own. PDOs, IMHO,
are only a workaround for languages where everything's an object.
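A trivial sketch of such a PDO in C (hypothetical fields) - a plain
struct plus ordinary functions is all it takes:

/* plain data object: just fields, no behaviour of its own */
struct order {
        int    id;
        double amount;
        char   currency[4];
};

/* an ordinary function instead of a method - no class needed */
static double order_total_in_cents(const struct order *o)
{
        return o->amount * 100.0;
}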
> Having an application as part of a window object --- not a fan.
ACK. But that kind of misdesign isn't limited to OOP.
> Expectations: 1990 - 2005 OOP was expected to be the savior.
Well, saving us from what, exactly?
> It would cure all our coding ills. Coding factories. Less errors.
> Useable by less skilled programmers. For the most part, that never
> happened.
Ah, here it comes: saving us from stupidity. Actually, most programmers
(I don't call them SW engineers) who insist on C++ or Java are indeed
stupid. Sad but true. At least that's my observation over the last two
decades (yes, I've been stupid often enough myself).
So that can't work. Technological solutions for social problems
won't work.
> Expectations (again): They said OOP must be all or nothing. Smalltalk
> good, Perl bad. No fair using structured code where it works best, and
> objects where they work best. Nope, if you're not 100% OOP, you're just
> a hack.
Yeah, ideologies taking precedence over rationality. Just like systemd.
(hmm, why isn't systemd written in C++ w/ Boost? Perhaps we should
infiltrate them and seed that idea ;-))
> LOL, Java required the main routine to be in a class: I'm forever
> seeing main classes that obviously are kludges for (disallowed)
> structured code.
Well, Java wasn't made for things that have something like a
'main program' - there are only objects, or groups of objects, which
live mostly on their own. The 'main class' here is just the little
glue to the 'old way'. In a 'perfect' Java environment, you don't have
such things (except for the runtime system itself). OTOH, one could
claim that C suffers from the same problem, as it needs a special main()
function. And both can be abused the same way.
The problem, IMHO, is that the JRE traditionally comes as one big blob
which creates its own world inside a black hole, instead of a set of
libraries and some small command-line utils. Anyway, we can live
w/o a JRE by using gcj.
> Often misused: How many hundreds of times have you seen somebody
> willy-nilly substitute a "has-a" relationship for an "is-a"
> relationship, and vice versa.
Oh, I've seen that in the vast majority of OOP projects. My personal
favourite: deriving business logic from some thread class, just because
it runs things in its own thread.
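A rough sketch of the 'has-a' alternative in C w/ pthreads (hypothetical
names): the business logic owns a worker thread instead of being derived
from one:

#include <pthread.h>

/* the business logic *has* a thread, it isn't one */
struct billing_job {
        pthread_t worker;      /* has-a thread */
        int       invoice_id;  /* the actual business data */
};

static void *billing_run(void *arg)
{
        struct billing_job *job = arg;
        /* ... process job->invoice_id ... */
        (void)job;
        return NULL;
}

static int billing_start(struct billing_job *job)
{
        return pthread_create(&job->worker, NULL, billing_run, job);
}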
Usually, those decisions are made by people who spend most of their
time painting fancy (but quite useless) UML diagrams before the
first LOC is written.
Again, the actual problem lies somewhere else and has nothing to do
w/ programming languages, paradigms, etc.
> How often have you seen completely
> unrelated things thrown into a class because "it seemed like a good
> idea at the time"? How many times have you seen code tracing turn into
> a volleyball game, where logic bounces repeatedly between three or four
> objects? How many classes have you seen whose sole purpose was to take
> data from one object to another? Dbus, anyone?
Yeah, the usual insanity.
But wait, there's even more: how often have you seen tons of
"abstraction layers" for simple OS functionality? For example, in a
recent project they even added C++ wrappers for trivial things like
sprintf() (yes! they didn't use snprintf()).
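For comparison, the bounded call libc already provides - no wrapper
class needed (trivial sketch):

#include <stdio.h>

int main(void)
{
        char buf[32];

        /* snprintf() never writes past the end of the buffer */
        snprintf(buf, sizeof(buf), "measurement: %d mV", 1234);
        puts(buf);
        return 0;
}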
Or another example, where the jerks (a medical device using
microcontrollers as well as an ARM-based Linux box) wrote lots of
things on their own which we already find in any decent OS or in the
standard libraries. On the microcontroller side that might still be
acceptable, but then they moved their whole stack on top of GNU/Linux,
just to "reuse" as much as possible - it didn't even solve any problem
that wasn't already solved by some existing standard mechanism.
One day they even admitted the reason behind it: they just don't
understand the Linux world (hmm, I still have to check back whether
they've meanwhile understood what the different clocks are for ...)
Again: there's no technical solution for stupidity.