From: o1bigtenor
Date:
CC: dng
Subject: Re: [DNG] Re (2): Modula 3 is not Oberon
On Sat, Oct 5, 2024 at 7:58 AM Steve Litt <slitt@???> wrote:
>
> o1bigtenor via Dng said on Tue, 1 Oct 2024 18:03:50 -0500
>
> >On Tue, Oct 1, 2024 at 5:24 PM Peter via Dng <dng@???>
> >wrote:
> >
> >> From: nick <nick@???>
> >> Date: Wed, 2 Oct 2024 01:35:20 +1000
> >> > ... was a shame they reduced the set datatype to something so
> >> > generic as to be useless.
> >>
> >> Discussed in the mailing list a few years back. I don't know the
> >> topic well enough to explain but the reasoning made perfect sense at
> >> the time. If I can find a citation to the list archive, will forward
> >> to you. Or you can go to the list and raise the topic again.
> >>
> >> Yes, learn both Ada and C; and any other language you have time for.
> >>
> >>
> >Sadly the days when I could work for 16 or 18 or even 20 hours or
> >more and be sharp the whole time have fallen out of vision in the
> >distance in the rear view mirror.
>
> Then use ChatGPT to do the initial research for you. Here's what
> ChatGPT says about the question "":
>
> ====================================================
> Ada is a high-level programming language that was designed for
> developing reliable and efficient software, particularly in systems
> where safety and security are critical. Here are some key points about
> Ada:
>
> Origins
> Developed in the 1970s: Ada was commissioned by the United States
> Department of Defense (DoD) in the late 1970s as a response to the need
> for a standardized language that could improve the quality of software
> in defense applications.
>
> Named After Ada Lovelace: The language is
> named after Ada Lovelace, who is often regarded as the first computer
> programmer.
>
> Features
> Strong Typing: Ada is known for its strong typing
> system, which helps catch errors at compile time rather than at
> runtime.
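>
> A minimal illustrative sketch of that (the Metres/Feet types are
> invented for the demo; any Ada 2012 compiler such as GNAT should
> accept it):
>
>    with Ada.Text_IO;
>    procedure Typing_Demo is
>       type Metres is new Float;   -- distinct types, even though both
>       type Feet   is new Float;   -- are represented as Float
>       Altitude : Metres := 100.0;
>       Height   : Feet;
>    begin
>       --  Height := Altitude;     -- rejected at compile time: type mismatch
>       Height := Feet (Altitude) * 3.2808;  -- conversion must be explicit
>       Ada.Text_IO.Put_Line (Feet'Image (Height));
>    end Typing_Demo;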
>
> Modularity: It supports modular programming through packages,
> which promote code reuse and organization.
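>
> A minimal illustrative sketch of a package (the Counters name and
> its operations are invented for the demo):
>
>    with Ada.Text_IO;
>    procedure Package_Demo is
>       package Counters is            -- the spec: what clients may use
>          type Counter is private;    -- clients see the name, not the record
>          procedure Increment (C : in out Counter);
>          function Value (C : Counter) return Natural;
>       private
>          type Counter is record
>             Count : Natural := 0;
>          end record;
>       end Counters;
>
>       package body Counters is       -- the body: hidden implementation
>          procedure Increment (C : in out Counter) is
>          begin
>             C.Count := C.Count + 1;
>          end Increment;
>          function Value (C : Counter) return Natural is (C.Count);
>       end Counters;
>
>       C : Counters.Counter;
>    begin
>       Counters.Increment (C);
>       Ada.Text_IO.Put_Line (Natural'Image (Counters.Value (C)));  -- prints 1
>    end Package_Demo;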
>
> Concurrency: Ada has
> built-in support for concurrent programming, making it suitable for
> real-time systems where multiple tasks need to run simultaneously.
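>
> A minimal illustrative sketch of a task (the order of the two output
> lines is not guaranteed):
>
>    with Ada.Text_IO;
>    procedure Task_Demo is
>       task Worker;                -- starts running when Task_Demo begins
>       task body Worker is
>       begin
>          Ada.Text_IO.Put_Line ("hello from the worker task");
>       end Worker;
>    begin
>       Ada.Text_IO.Put_Line ("hello from the main task");
>    end Task_Demo;                 -- Task_Demo waits here for Worker to end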
>
> Exception Handling: The language includes robust exception handling
> features, which help manage errors gracefully.
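>
> A minimal illustrative sketch of a handler (Integer'Value raises
> Constraint_Error on unparsable input):
>
>    with Ada.Text_IO;
>    procedure Exception_Demo is
>       N : Integer;
>    begin
>       N := Integer'Value ("not a number");       -- raises Constraint_Error
>       Ada.Text_IO.Put_Line (Integer'Image (N));  -- never reached
>    exception
>       when Constraint_Error =>
>          Ada.Text_IO.Put_Line ("bad input handled; program exits cleanly");
>    end Exception_Demo;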
>
> Applications
> Safety-Critical Systems: Ada is widely used in aerospace, military, and
> transportation industries, where safety and reliability are paramount.
>
> Real-Time Systems: Its concurrency features make it a good choice for
> real-time applications, such as embedded systems.
>
> Standards
> Ada 83, Ada 95, Ada 2005, Ada 2012: The language has undergone
> several revisions, with updates that enhance its features and
> usability while maintaining backward compatibility.
>
> Compilers and Tooling
> Open Source and Commercial Options: There are both open-source and
> commercial Ada compilers available, such as GNAT, which is part of
> the GNU Compiler Collection (GCC).
>
> Learning Curve
> Complexity: While Ada's features
> provide powerful capabilities, they can also lead to a steeper learning
> curve compared to simpler languages.
>
> Overall, Ada is a powerful
> language that emphasizes reliability and maintainability, making it a
> strong choice for critical software systems.
> =====================================================================
>
> From there you just keep refining questions. You can have a working
> Hello World of any language in an hour or less with the help of ChatGPT.
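>
> For reference, a complete Ada Hello World is only a few lines (the
> gnatmake command shown is GNAT-specific):
>
>    with Ada.Text_IO;
>    procedure Hello is
>    begin
>       Ada.Text_IO.Put_Line ("Hello, world!");
>    end Hello;
>
>    --  save as hello.adb, then:  gnatmake hello.adb && ./hello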
>

In another world - - - possibly - - - but here, because I do not have
reasonable 'stupid phone' reception, I cannot sign up for the wonders
of Chatski.

Then, as I spent some time pondering the AI conundrum, I came up with what AI is!
AI is the normalization of thought - - - all thought, that is, compressed
to perhaps at most 1/2 a standard deviation from the mean. Couple that
with the fact that the most interesting discoveries of the last quite
some time have come from dedicated, often brilliant 'individuals' with
occasionally very small teams (some modern work takes scientists with
different specializations to put things together) - - - but as far as I
can tell, no committee has ever really developed something widely
different or seriously ground-breaking. I am really starting to think
that the push for AI is to keep chipzilla churning out the beaucoup de
bucks for another 15 years.
(It's the old adage: 'follow the money'.)

So although your suggestion is not a 'bad' one - - - I won't be taking
advantage of it.

Thanks for the idea(s).