Author: Simon Hobson
Date:
To: dng@lists.dyne.org
Subject: Re: [DNG] Studying C as told. (For help)
Rainer Weikusat <rweikusat@???> wrote:

> Lastly, if the target system is Linux (as here), one can safely assume
> that EBCDIC won't be used.
>
> None of this matters anyhow for solving algorithm exercises in an
> entry-level book about a programming language.


On the other hand, it might just give a newbie to C (and to other languages) some hints that could save their bacon (or that of others) later. When I first learned any programming, we too worked on the basis that "letters are A-Z and a-z" - which, as has been pointed out, breaks quite convincingly once you step outside the "basic ASCII" character set. I would imagine that such hardcoded assumptions have been behind many a problem when internationalising code.
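
To make that concrete, here is a minimal sketch (the function name and the test value are mine, not from the book). The hardcoded range test bakes the ASCII assumption in; isalpha() from <ctype.h> defers to the implementation and the current locale instead:

    #include <ctype.h>
    #include <locale.h>
    #include <stdio.h>

    /* ASCII assumption baked in: blind to accented letters, and on
       EBCDIC - where A-Z is not a contiguous range - it even matches
       characters that are not letters at all. */
    static int is_letter_naive(int c)
    {
        return (c >= 'A' && c <= 'Z') || (c >= 'a' && c <= 'z');
    }

    int main(void)
    {
        setlocale(LC_CTYPE, "");    /* honour the user's locale */
        unsigned char c = 0xE9;     /* 'e' with acute accent in ISO-8859-1 */
        printf("naive:   %d\n", is_letter_naive(c));
        printf("isalpha: %d\n", isalpha(c) != 0);
        return 0;
    }

In a Latin-1 locale the library call reports 0xE9 as a letter; the hand-rolled test never will, regardless of locale.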


On 21 Jun 2016, at 18:50, Irrwahn <irrwahn@???> wrote:

>> An interesting task would be to look at the various algorithms offered, work out how the compiler is likely to handle them when turning them into code (or even look at the code generated), and then work out the efficiency of the different methods in terms of CPU cycles/character.
>> Of course, it's possible that different methods may come out best depending on the amount of whitespace to be removed.
>
> And results will differ depending on platform, toolchain,
> compiler version, optimizer setting, and so on.
>
> Another aspect beyond just runtime efficiency is this:
>
> Source code is written for humans, not machines! Clarity and
> simplicity can help minimize code maintenance cost, and thus
> easily outweigh some slight runtime penalty. Whoever has found
> himself in a situation where he had to spend a considerable
> amount of time trying to grok what some "clever", "manually
> optimized" code is supposed to do, knows what I'm talking about.


I have no argument with that, or with the rest of your response. However, I do think it is important for a programmer working in a high-level language to have some concept of how changes in code (sometimes subtle ones) can impact performance. I wasn't suggesting manually optimising code - rather, looking at how different algorithms and code arrangements affect how the end result runs.
Having no idea at all - or worse, not giving a s**t - leads to the scourge of "performance by throwing hardware at it".
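
To put a number on that using the exercise this thread started from - squeezing whitespace out of a string - here is a sketch of two functionally identical approaches (both functions are my own illustration, not taken from the book):

    #include <ctype.h>
    #include <string.h>

    /* O(n^2) worst case: every blank found shifts the entire
       remainder of the string left by one. */
    void squeeze_shift(char *s)
    {
        char *p = s;
        while (*p) {
            if (isspace((unsigned char)*p))
                memmove(p, p + 1, strlen(p + 1) + 1);
            else
                p++;
        }
    }

    /* O(n): a single pass that copies each kept character once. */
    void squeeze_onepass(char *s)
    {
        char *src, *dst;
        for (src = dst = s; *src; src++)
            if (!isspace((unsigned char)*src))
                *dst++ = *src;
        *dst = '\0';
    }

On an input of length n that is mostly whitespace, the first does on the order of n*n/2 character moves against n for the second - exactly the kind of difference worth understanding, and one that no optimiser setting will paper over.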

I suspect we've all conversed with people who have approximately zero knowledge of, or interest in, how "the greasy bits" of the machine end up running their code. My first computer came with just 1k of RAM, and sockets for 8k in total. It's surprising what you can do with that!