:: Re: [DNG] C vs ADA : advice sought
From: dvalin
Date:
To: dng
Subject: Re: [DNG] C vs ADA : advice sought
On 01.10.24 12:13, Dan Purgert via Dng wrote:
> On Sep 30, 2024, o1bigtenor via Dng wrote:
> > Greetings
> >
> > if I were to want to choose between learning C (and likely adding C++) and
> > learning ADA for programming microcontrollers and embedded systems
> > what - - - besides amount of usage would you use to advise me - - -
> > which should I learn (and why please (this is at least as important as
> > your choice!!!))?
>
> At the end of the day, you need to be able to read and understand the
> Minimum Working Example ("MWE") code in the datasheets. And, well,
> that's going to be C.


Seconded. Atmel (now Microchip) AVR ATmega datasheets include C and
assembler examples for config register setup, so it is essential to
understand one or both. Having to translate them into some fandangled,
as yet little understood, language abstraction is only a liability,
amplified at the bottom of the learning curve.

Bringing up a new embedded design requires correctly setting port pins
to input or output as needed, so you can blink that first LED to show
that the main code loop is running. Once the on-board UART is
configured for baud rate, word length, and start & stop bits, you can
insert logging here and there in the code to report diagnostic bumpf
for display on a PC, given a serial-to-USB dongle. Runtime examination
of a specific variable or two can cut debugging of a problem from days
to minutes. Last century we used an in-circuit emulator to physically
trace program execution. It was an expensive luxury, with its own
learning curve. Now I find that logging suffices. You'll have to write
a message buffer for serial logging, to avoid impacting real-time
program execution, but that's a good little exercise.
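
To make that concrete, here is roughly what a first bring-up boils
down to on an ATmega328P with avr-gcc and avr-libc. The clock
frequency, LED pin, and baud rate are illustrative choices of mine,
not gospel:

#define F_CPU 16000000UL        /* assumed 16 MHz clock */
#include <avr/io.h>
#include <util/delay.h>

static void uart_init(void)
{
    /* 9600 baud, 8 data bits, no parity, 1 stop bit */
    UBRR0 = (F_CPU / (16UL * 9600)) - 1;    /* baud rate divisor */
    UCSR0B = (1 << TXEN0);                  /* enable transmitter */
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00); /* 8N1 frame format */
}

static void uart_putc(char c)
{
    while (!(UCSR0A & (1 << UDRE0)))        /* wait for data register */
        ;
    UDR0 = c;
}

int main(void)
{
    DDRB |= (1 << DDB5);                    /* PB5 (Uno LED) as output */
    uart_init();
    for (;;) {
        PORTB ^= (1 << PORTB5);             /* toggle the LED */
        uart_putc('.');                     /* heartbeat over serial */
        _delay_ms(500);
    }
}

The blocking uart_putc is exactly the bit that the message buffer
exercise later replaces.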

When I began my 30-year embedded systems R&D career in 1978, it was
all assembler, as there were no compilers for 8-bit micros back then.
Old habits have me still writing interrupt handlers and hardware
register setup code in assembler. Setting specific bits in hardware
registers at fixed addresses does not benefit from language
abstraction, so the closer the language is to the hardware, the better.
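
By way of illustration, all that C needs for such a register is a
fixed address and a volatile access; 0x25 here is PORTB's data-space
address on the ATmega328P, normally hidden behind <avr/io.h>:

/* An 8-bit hardware register memory-mapped at a fixed address. */
#define MY_PORTB (*(volatile unsigned char *)0x25)

void set_bit3(void)
{
    MY_PORTB |= (1 << 3);   /* read-modify-write: set bit 3 only */
}

avr-gcc typically compiles that access down to a single sbi
instruction, which is to say there is nothing left for a language
abstraction to improve on.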

Please give thought to how much maintainer support there is for
compiler front ends for fancy languages when the back end is an
embedded target. If you report an Ada bug in gcc when compiling for an
ATmega328P, the response may well be "No-one uses Ada on 8-bit,
surely?" There have been several occasions when even C support for AVR
has required exceptional intervention by gcc and binutils maintainers,
sometimes due to its Harvard architecture. Going off-road is
courageous.

Highly abstracted languages are fine for "cotton wool programming" on
*nix or another OS, seeing nothing but stdin, stdout, and stderr, and
writing to files by pointing vaguely in their general direction. On
"bare iron", thin gloves which allow a feel for the hardware are
preferable, I'd suggest. (But that's based only on half a century of
programming experience.)

Even C thinking can occasionally be too abstracted, I find. The mushy
term "dereferencing" seems too clumsy for embedded programming, if a
clear understanding of what is really happening is to be had. The
hardware *does* use pointers: indirect addressing uses the value in a
register to read or write an address in memory. That is what the term
"dereferencing" obfuscates. An array is in reality only a pointer to
its base address, plus an offset to reach an element.
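
C even defines the subscript operator in those terms, so a short
demonstration suffices:

#include <stdint.h>

uint16_t samples[4];

/* samples[n] is defined as *(samples + n): the base address plus an
 * offset scaled by the element size. With a runtime index, avr-gcc
 * forms the address in a pointer register pair and performs exactly
 * the indirect load described above. */
uint16_t nth_sample(uint8_t n)
{
    return *(samples + n);  /* identical to samples[n] */
}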

(On a small microcontroller I even specify -nostartfiles for gcc, and
write my own crt0: initialisation of the .bss and .data sections, i.e.
zero-initialised and constant-initialised RAM. YMMV. I also add
-nostdlib, to reduce code size, when that suits. I don't know if an
Ada program can be trimmed to under 1 kB in size: 500 assembler
instructions, including hardware initialisation and all application
code.)
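
The whole job of a crt0 is small. Here is a sketch of the idea in C;
the symbol names are illustrative and must match the linker script in
use, and note that on the AVR's Harvard architecture the .data image
has to be fetched from flash with lpm, which is one reason the real
thing is usually written in assembler:

#include <stdint.h>

/* Boundary symbols provided by the linker script (names vary). */
extern uint8_t _data_load[];   /* .data initialiser image          */
extern uint8_t _data_start[];  /* .data destination in RAM         */
extern uint8_t _data_end[];
extern uint8_t _bss_start[];   /* .bss = zero-initialised RAM      */
extern uint8_t _bss_end[];

extern int main(void);

void _start(void)
{
    /* (On many MCUs the stack pointer must be set up first.) */
    const uint8_t *src = _data_load;
    uint8_t *dst;

    for (dst = _data_start; dst < _data_end; dst++)
        *dst = *src++;          /* constant-initialised variables */
    for (dst = _bss_start; dst < _bss_end; dst++)
        *dst = 0;               /* zero-initialised variables */

    main();
    for (;;)
        ;                       /* nowhere to return to on bare iron */
}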

When contemplating going to C++ on a microcontroller, please consider
the advice on marriage: "If young, not yet. If old, then never."
Runtime instantiation of class instances is definitely harmful to
realtime performance, and may exhaust the tiny amount of available
RAM, causing an instant crash. OK, an ATmega328P (Arduino Uno) has all
of 2 kilobytes of RAM, but the ATtiny2313 musters a mere 128 bytes on
top of the 32 8-bit registers. I have programmed a 3-channel LED
dimmer with serial communications for remote control on that chip, but
not in C++.

On a small microcontroller with stuff-all RAM and hard real-time
application requirements, I never use C's malloc. Its behaviour is
nondeterministic: the time required to faff about finding a free
memory block big enough to accommodate the request is unpredictable
and sizeable. Accumulated fragmentation exacerbates the problem over
time.

(Where a pool of message buffers was needed in a DIY protocol stack, I
wrote a "balloc", which allocated fixed-size RAM blocks, so the first
one on the free list was always suitable, and fragmentation could not
occur. That provided quick response, determinism, and unlimited
uptime.)
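
That balloc isn't reproduced here, but the idea fits in a page: a
singly linked free list threaded through a static pool. A sketch, with
illustrative names and sizes:

#include <stddef.h>
#include <stdint.h>

/* Every block is the same size, so the head of the free list is
 * always suitable and fragmentation cannot occur. */
#define BLK_SIZE  32
#define BLK_COUNT 8

typedef union block {
    union block *next;          /* link, valid while the block is free */
    uint8_t payload[BLK_SIZE];
} block_t;

static block_t pool[BLK_COUNT];
static block_t *free_list;

void balloc_init(void)
{
    uint8_t i;

    for (i = 0; i < BLK_COUNT - 1; i++)
        pool[i].next = &pool[i + 1];    /* chain the pool together */
    pool[BLK_COUNT - 1].next = NULL;
    free_list = &pool[0];
}

void *balloc(void)
{
    block_t *b = free_list;
    if (b)
        free_list = b->next;    /* O(1): pop the head, no searching */
    return b;
}

void bfree(void *p)
{
    block_t *b = p;
    b->next = free_list;        /* O(1): push back onto the free list */
    free_list = b;
}

If blocks are freed from interrupt context, those few list operations
need interrupt protection (cli/sei on AVR), but the determinism stands.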

On 8-bit microcontrollers, the KISS principle is key to survival.
There is no fat for surplus fancy stuff. But go to an ARM chip with
reams of RAM and program memory, and you can pack the piano in your
luggage for the trip. So, is the aim an ocean liner cruise, or
wilderness canoeing?

Erik