Author: John Hopkins
Date:
To: Bricolabs
Subject: Re: [Bricolabs] texts: IOT et al...
Hi August!
> Basically, the layman's definition of information tends to be the
> communicable signal that rises above the noise (randomness) and
> carries meaning.
>
> According to Shannon's narrow definition, information is a _measure_ of
> entropy. In other words, information is not the signal in a noisy
> channel, but a measure of the signal's content. It's a quantity, not a
> thing. More information in a signal means more randomness (noise) in
> the signal. A signal with lots of redundancy carries little information.
Ah, yeah, the entropy connection.
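
If you want to see the "quantity, not a thing" side of it concretely,
here's a quick Python toy (my own sketch, nothing from Shannon's paper):
treat a string as the signal, count symbol frequencies, and compute bits
per symbol. The redundant string carries far less information than the
one that uses its alphabet evenly.

import math
from collections import Counter

def shannon_entropy(signal: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(signal)
    total = len(signal)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Highly redundant signal: few surprises, so little information per symbol.
redundant = "aaaaaaaaab" * 10

# A signal that uses its alphabet evenly: much more "surprise" per symbol.
varied = "abcdefghij" * 10

print(f"redundant: {shannon_entropy(redundant):.3f} bits/symbol")  # ~0.469
print(f"varied:    {shannon_entropy(varied):.3f} bits/symbol")     # ~3.322
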
> Shannon is also careful to decouple "meaning" from his concept of
> "information".
>
> It's more complicated, subtle, and interesting than what I describe
> above, but that's the gist. Most of our communication technologies
> depend on these ideas.
>
> The stuff on the coding theorem and stochastic signals is what I find
> most interesting. What makes a large part of his information theory
> work is that most "human" signals (music, writing, etc.) are
> stochastic: non-deterministic but statistically predictable.
>
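
That "statistically predictable" part is easy to see with a toy
first-order Markov model. Again, just my own illustration (assume any
longer English text as the corpus): the next character is never
determined, but its distribution is far from uniform.

from collections import Counter, defaultdict

# Tiny stand-in corpus; any longer English text would work better.
text = ("information is a measure of entropy and most human signals "
        "such as music and writing are statistically predictable")

# Count which character tends to follow which (first-order Markov chain).
follows = defaultdict(Counter)
for cur, nxt in zip(text, text[1:]):
    follows[cur][nxt] += 1

# Non-deterministic, but the statistics are heavily skewed:
for ch in "aeiou":
    nxt, n = follows[ch].most_common(1)[0]
    total = sum(follows[ch].values())
    print(f"after {ch!r} the most likely next character is {nxt!r} ({n}/{total})")
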
> One interesting thing regarding stochastic signals is that you can
> remove parts of them and still convey enough for the message to be
> "understood". E.g.: I cn wrt t y wtht vwls nd y shld b ble t ndrstnd