:: Re: [Bricolabs] texts: IOT et al...
Author: Armin Medosch
Date:  
To: Bricolabs
Subject: Re: [Bricolabs] texts: IOT et al...
Hi John,

my 2005 MA thesis on Technological Determinism in Media Art should still
make for enlightening reading: http://www.thenextlayer.org/node/1214

For the rough and ready, 45 RPM has some condensed presentations of the
issues: http://www.thenextlayer.org/node/1192

My PhD dissertation on New Tendencies remains unpublished. However, I am
happy to give lectures on it anywhere in the world, and hopefully a
publisher can be found to turn it into a book ... some of the issues are
presented here: http://www.thenextlayer.org/node/1365

cheers
Armin


On 08/16/2012 06:36 AM, Tapio Makela wrote:
> Hi John,
>
> Is Boulder still a mixture of super sporty, shiny people and hippie nostalgia? :)
>
> For the course, this essay I wrote back in 2000 could be of some use:
> "Re-reading digitality through scientific discourses of cybernetics: Fantasies of disembodied users and embodied computers"
> http://www.hum.utu.fi/oppiaineet/mediatutkimus/tutkimus/proceedings_pienennetty.pdf
>
> Wendy's work is also insightful for seminars, though maybe not for an introductory lecture series:
> http://www.brown.edu/Departments/MCM/people/facultypage.php?id=10109
>
> In lecture series, as an introduction to critical HCI, I tend to use this book by Paul Dourish:
> http://www.dourish.com/embodied/
> (though some of the examples there are somewhat outdated, it works very well in teaching).
>
> In my opinion, Kathy Hayles is 70% brilliant, yet where she enters posthumanist discourse there is also plenty to be critical about.
> The same goes for Tiziana Terranova's work. Both are nevertheless good reads in seminar contexts.
>
> cheers!
>
> Tapio
>
>
>
> On Aug 16, 2012, at 07:20, John Hopkins <jhopkins@???> wrote:
>
>> Hi August!
>>
>>> Basically, the layman's understanding of information tends to define it
>>> as the communicable signal that rises above the noise (randomness) and
>>> carries meaning.
>>>
>>> According to Shannon's narrow definition, information is a _measure_ of
>>> entropy. In other words, information is not the signal in a noisy
>>> channel, but a measure of the signal's content. It's a quantity, not a
>>> thing. More information in a signal means more randomness (noise) in
>>> the signal. A signal with lots of redundancy has little information.
>>
>> Ah, yeah, the entropy connection,
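
To make the "quantity, not a thing" point concrete, here is a minimal
Python sketch. It assumes a simple per-symbol frequency model, which is a
simplification of Shannon's source models, and the function name and test
strings are only illustrative:

  from collections import Counter
  from math import log2

  def entropy_per_symbol(message):
      # Estimate Shannon entropy (bits per symbol) from symbol frequencies.
      counts = Counter(message)
      total = len(message)
      return -sum((n / total) * log2(n / total) for n in counts.values())

  # A constant, fully redundant signal has no surprise per symbol ...
  print(entropy_per_symbol("aaaaaaaaaaaaaaaaaaaa"))
  # ... while a more varied ("more random") signal measures higher.
  print(entropy_per_symbol("the quick brown fox jumps over the lazy dog"))

The second call comes out at roughly 4.5 bits per symbol, which is the
sense in which information measures randomness rather than meaning.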
>>
>>> Shannon is also careful to decouple "meaning" from his concept of
>>> "information".
>>>
>>> It's more complicated, subtle, and interesting than what I describe
>>> above, but that's the gist. Most of our communication technologies
>>> depend on these ideas.
>>>
>>> The material on the coding theorems and stochastic signals is what I
>>> find most interesting. What makes a large part of his information theory
>>> work is that most "human" signals (music, writing, etc.) are stochastic:
>>> non-deterministic but statistically predictable.
>>>
>>> One interesting thing about stochastic signals is that you can remove
>>> parts of them and still convey enough for the message to be
>>> "understood". E.g.: I cn wrt t y wtht vwls nd y shld b ble t ndrstnd
>> _______________________________________________
>> Brico mailing list
>> Website on http://www.bricolabs.net
>> Unsubscribe: http://lists.dyne.org/mailman/listinfo/brico
>>
>
>
>