Author: august
Date:
To: Bricolabs
Subject: Re: [Bricolabs] texts: IOT et al...

> August -- I definitely had Shannon, but hadn't added it to that library
> yet... I'll cover coding, to be sure, from linguistic, technical,
> control, and social points-of-view, though we don't have the time to go
> into it (it's a seminar versus studio course -- I'd like it better to be
> doing things as well as talking about things!) (I'm intrigued about your
> statement that the (function?) of coding-as-information is
> counter-intuitive/directly at odds with a normative definition of
> information -- can you expand on that?)
>


Basically, the layman's definition tends to treat information as the
communicable signal that rises above the noise (randomness) and
carries meaning.

According to Shannon's narrow definition, information is a _measure_:
it is quantified by entropy. In other words, information is not the
signal in a noisy channel, but a measure of the signal's content. It's
a quantity, not a thing. The more information a signal carries, the
more randomness (noise) it contains; a signal with lots of redundancy
carries little information.
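To make the "measure" idea concrete, here's a quick Python sketch (my
own toy example; only the formula H = -sum(p * log2 p) is Shannon's):

    import math
    from collections import Counter

    def entropy_bits_per_symbol(msg):
        # Shannon entropy: H = -sum(p * log2 p) over symbol frequencies
        counts = Counter(msg)
        total = len(msg)
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    print(entropy_bits_per_symbol("aaaaaaab"))  # ~0.54 bits/symbol
    print(entropy_bits_per_symbol("abcdefgh"))  # 3.0 bits/symbol

The redundant string measures low and the "random" one measures high,
which is exactly the counter-intuitive part.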

Shannon is also careful to decouple "meaning" from his concept of
"information".

It's more complicated, subtle, and interesting than what I describe
above, but that's the gist. Most of our communication technologies
depend on these ideas.

The stuff on the coding theorems and stochastic signals is what I find
most interesting. What makes a large part of his information theory
work is that most "human" signals (music, writing, etc.) are
stochastic: non-deterministic, but statistically predictable.
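Shannon's own illustration of this in the 1948 paper is his series of
statistical "approximations to English". Here's a rough first-order
character version in Python (the function and its parameters are mine,
a sketch rather than his construction):

    import random
    from collections import defaultdict

    def markov_babble(text, n=60, seed=1):
        # pick each character from the distribution of characters
        # that follow the current one in the sample text
        random.seed(seed)
        follows = defaultdict(list)
        for a, b in zip(text, text[1:]):
            follows[a].append(b)
        out = [random.choice(text)]
        for _ in range(n - 1):
            nxt = follows.get(out[-1])
            out.append(random.choice(nxt) if nxt else random.choice(text))
        return "".join(out)

Fed some English prose, it spits out nonsense, but statistically
English-like nonsense -- which is the sense in which writing is
predictable.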

One interesting thing about stochastic signals is that you can remove
parts of them and still convey enough for the message to be
"understood". E.g.: I cn wrt t y wtht vwls nd y shld b ble t ndrstnd

zzzzt.

best -august.



--
http://aug.ment.org
GPG: 0A8D 2BC7 243D 57D0 469D 9736 C557 458F 003E 6952