On 12/08/2024 at 00:18, Steve Litt wrote:
> Didier Kryn said on Sun, 11 Aug 2024 13:57:31 +0200
>
>> On 11/08/2024 at 04:06, Steve Litt wrote:
>>> I think it's a pretty darn good language. It makes sense, its syntax
>>> is pretty guessable (except for pointers to functions), and it's a
>>> small language that's easily learned.
>>>
>> Yes, you think you learned it... but sometimes you discover you
>> missed something. And there's a lot one can miss in C. There's a lot
>> of subtleties, like in automatic type promotion,
> You learn that, and personally I always use gcc -Wall.
>
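Since -Wall came up: here is the kind of promotion trap I have in
mind. A toy sketch, nothing from real code; the surprise is that ~a
is evaluated as an int:

    #include <stdio.h>

    int main(void)
    {
        unsigned char a = 0xff;
        unsigned char b = ~a;    /* ~a is computed as int: ~255 == -256,
                                    then truncated back to 0 in b */

        if (b == ~a)             /* b promotes to 0, ~a is -256: false */
            printf("equal\n");
        else
            printf("not equal\n");   /* this is what actually prints */
        return 0;
    }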
>> for example, like the
>> difference between static and automatic variables,
> Everyone knows that, and uses it a lot.
>
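For anyone following along, the difference in one trivial sketch:

    #include <stdio.h>

    int next_static(void)
    {
        static int n = 0;   /* initialized once, keeps its value
                               between calls */
        return ++n;
    }

    int next_auto(void)
    {
        int n = 0;          /* automatic: a fresh 0 on every call */
        return ++n;
    }

    int main(void)
    {
        printf("%d %d\n", next_static(), next_static()); /* 1 and 2,
                                                            in some order */
        printf("%d %d\n", next_auto(), next_auto());     /* always 1 1 */
        return 0;
    }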
>> static constants
>> and automatic constants,
> I'll admit that the preceding is tricky when inside the parentheses of
> a function declaration.
>
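If we are thinking of the same thing: inside the brackets of an array
parameter, 'static' (C99) has nothing to do with storage duration, and
a 'const' variable in C is not a constant expression the way it is in
C++. A sketch of both, under that reading:

    #include <stddef.h>

    /* 'static 16' promises the compiler that buf points to at least 16
       elements; 'const' only says the function will not modify them. */
    size_t checksum(const unsigned char buf[static 16])
    {
        size_t sum = 0;
        for (size_t i = 0; i < 16; i++)
            sum += buf[i];
        return sum;
    }

    void demo(void)
    {
        const int n = 8;
        char vla[n];            /* n is not a constant expression in C,
                                   so this is a variable-length array */
        static char fixed[8];   /* a static array needs a real constant:
                                   'static char bad[n];' would not compile */
        (void)vla;
        (void)fixed;
    }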
>> embedded functions...
> I'll admit I've never done embedded programming, so I don't know about
> embedded functions. If you happen to mean "nested functions", why bother?
>
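Assuming we are both talking about nested functions: they exist only
as a GNU C extension, which is exactly the kind of trap I mean -- gcc
accepts the sketch below, clang rejects it:

    #include <stdio.h>

    int main(void)
    {
        int total = 0;

        /* nested function: a GNU C extension, not ISO C */
        void add(int x) { total += x; }

        add(3);
        add(4);
        printf("%d\n", total);   /* prints 7 */
        return 0;
    }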
>> And this horrible thing
>> of using the equal sign for assignment:
>>
>> When you write
>>
>> x = 1;
>>
>> ...
>>
>> x = 2;
>>
>> Anyone with a notion of mathematics will call you a liar.
> Yes, this took me 10 minutes to assimilate when I learned Fortran in
> school.
>
>> And
>> it
>> forces you to use == to mean equality, which has been the source of
>> zillions of bugs, because, in C, an assignment is an expression.
> if(x = 5) throws a warning, so if you use -Wall it's not a problem.
>
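True -- and for the archives, the classic slip it catches looks like
this (a minimal sketch):

    #include <stdio.h>

    int main(void)
    {
        int x = 0;

        if (x = 5)                /* assignment, not comparison: the
                                     condition is always true here, and
                                     gcc -Wall warns about it */
            printf("x is %d\n", x);    /* prints 5 */

        if (x == 5)               /* the comparison actually intended */
            printf("x really is 5\n");
        return 0;
    }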
>> In any human language, people count things by giving number 1 to
>> the first and number n to the nth. In C the first has number 0 and the
>> nth has number n-1. Just for the sake of pointer arithmetic.
> A lot of languages have that same (mis)feature. You get used to it.
> Hey, I liked Turbo Pascal 3.0 even more than C, and TP had x:=5 for
> assignment and started counting at 1, and didn't have pointers. I wish
> Turbo Pascal had taken over the world, but C is a good second choice.
>
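The 0-based convention really is pointer arithmetic in disguise:
a[i] is defined as *(a + i), so the first element has to sit at
offset 0. A toy sketch:

    #include <stdio.h>

    int main(void)
    {
        int a[3] = { 10, 20, 30 };

        printf("%d\n", a[0]);       /* first element, offset 0 */
        printf("%d\n", *(a + 2));   /* identical to a[2]: the 3rd
                                       element lives at offset 2 */
        /* a[3] would be one past the end: the classic off-by-one */
        return 0;
    }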
>> The enumerations share one single namespace, which is the same as
>> the variable names.
>>
>> In C, INT_MAX + 1 overflows silently -- on a typical machine it
>> wraps around to INT_MIN ( INT_MAX is defined in <limits.h>)
>>
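Two sketches of what I mean by those last two points; the second one
is undefined behaviour, so what you actually see depends on the
compiler and its options:

    #include <limits.h>
    #include <stdio.h>

    enum color { RED, GREEN, BLUE };
    /* enum light { RED, AMBER };   would not compile: enumerators share
                                    the ordinary identifier namespace, so
                                    RED is already taken, and a variable
                                    named GREEN at this scope would clash
                                    with the enumerator too */

    int main(void)
    {
        int n = INT_MAX;
        n = n + 1;          /* signed overflow: undefined behaviour; on a
                               typical two's-complement machine it silently
                               wraps around to INT_MIN */
        printf("%d\n", n);
        return 0;
    }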
>> My PoV on C is that it is just the layer above assembler,
> X-actly! And when I need just the layer above assembler, I use C.
> Otherwise I use Python.
>
>> in
>> addition to being universal. And, for this, it is pretty good. It is good
>> for little things, but great things, e.g. the Linux kernel or the
>> GTK library, are real achievements of expert programmers. Yet I'm still
>> programming in C because of this: every programmer roughly understands
>> it. But after 44 years of using this language, I still learned
>> something 2 weeks ago and something else yesterday -- it's not so often
>> though (~:
> Well, you don't have Dunning-Kruger syndrome. You probably learn new
> things about every subject on a regular basis. At least with C, you
> learned enough of the language to be productive quickly, assuming you
> used -Wall.
Like for many here, when I first started programming in C, it was a
great breath of fresh air after FORTRAN and PDP15 assembler. But that
was 44 years ago. By then I had already forgotten the very first
language I had learnt, ALGOL68, which I was not yet able to appreciate
properly. Later I discovered that even IBM-370 assembler supported
structures and dereferencing, which FORTRAN dramatically lacked. I bet
FORTRAN has evolved since then. So has C: I've noticed that it now
provides a newer and safer syntax to assign values to structure and
array elements.
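The newer syntax I am thinking of is C99's designated initializers
and compound literals; a small sketch (struct and field names invented
for the example):

    #include <stdio.h>

    struct point { int x, y; };

    int main(void)
    {
        /* designated initializers: fields named explicitly, order does
           not matter, anything not mentioned is zeroed */
        struct point p = { .y = 2, .x = 1 };
        int table[8]   = { [3] = 7, [5] = 9 };

        /* compound literal: assign a whole new value to the struct */
        p = (struct point){ .x = 10, .y = 20 };

        printf("%d %d %d %d\n", p.x, p.y, table[3], table[5]);  /* 10 20 7 9 */
        return 0;
    }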
I could continue, because I have more to say about how the C language
could be made safer, but I think I've made my point: this language is
easy to learn, but it is also easy to make errors with it -- errors
which don't necessarily crash but silently produce erroneous results
or effects.
I loved C and I built several data acquisition applications with it.
When I faced a newer and more complex problem, I knew it would be
impossible in C, notably because of unavoidable bugs. I looked for an
alternative language and chose Ada. Three of us learnt this language,
and I wished I would never have to write programs in any other
language. We produced a very efficient and bug-free application which
was multitasking and multi-host (with two different architectures and
endiannesses).
Now I'm back to little applications written in C because of the
value of a widely used language. But I know I must take great care.