OASIS Mailing List ArchivesView the OASIS mailing list archive below
or browse/search using MarkMail.

 


Help: OASIS Mailing Lists Help | MarkMail Help

humanmarkup-comment message

[Date Prev] | [Thread Prev] | [Thread Next] | [Date Next] -- [Date Index] | [Thread Index] | [Elist Home]


Subject: RE: [humanmarkup-comment] Base Schema-haptic


Yes. I'm just looking to tease out those relationships and the 
situatedness you mention, and which David Dodds is working on, too. 
Haptics per se is just touching behavior, but we have to make sure we 
demarcate where touching and sensing diverge, while recognizing that 
in some circumstances they merge.

Not an easy distinction when sending and receiving get that close together.

Ciao,
Rex

At 9:02 AM -0500 8/23/02, Bullard, Claude L (Len) wrote:
>Haptics among other things force out the need to identify
>how to denote co-occurrence constraints.   For example, when
>the culture is Borg, the haptic values fall into say
>some range depending on the other relationships in effect
>at the time and place and objects present (the context).
>This aspect of situatedness that the KR folks talk about
>dominates the design of a description of a human
>communication.
>
>That is why as I have been working the semiote requirements,
>I have found myself focusing more on individual selector
>types that have to be active in order for a semiote to
>choose among signs both to perceive (what we don't
>believe or understand, we don't note), and to send
>(our emotional values weight our selections even
>when logic says otherwise).  These selectors are
>activated by among other things, the **proximity** of
>other objects in the environment and the signs they
>can receive and emit, and will receive and emit.
>It begins to look very object-oriented in implementation,
>but a VRMLie usually understands how this is modeled
>in the coordinate space, and an AI guy knows how to
>represent the relationships in db objects that feed
>that real time engine.
>
>Seeing that early, I left a lot of the primaries
>abstract because I think we have to have some simple
>and reasonably uniform/universal definitions up
>there.  It is as said, a weak ontology because
>a strong one would quickly lead to fractures.
>Many object designs have weak abstract classes. 
>
>You are right, but haptic is just touching behavior.
>The rest has to be done in combinations of co-ocurring
>ranges of values of the other elements.  The combinations
>don't occur in the primary, but in the secondaries. Yes?
>
>len
>
>
>From: Rex Brooks [mailto:rexb@starbourne.com]
>
>haptic
>
>The element is a ComplexType and abstract . It does not reference
>other elements and is not used by other elements and belongs to the
>attribute group humlIentifierAtts.
>
>And that is where the easy categorization ends. I've been thinking
>about this element for a while now, and that is the reason why it ha
>taken me a while to get to it, rather than our other scheduled work,
>i.e. our most recent meeting. Rather than enumerate all the info in
>in the description which is fairly lengthy, I will have to ask that
>it be read because I want to suggest that we give this element quite
>a bit of thought. I do not in any way disagree with what Len has done
>in the straw man, and I will gladly accept that as the most useful
>definition of the term, but I think we need to consider it more
>deeply, especially as it relates to elements to come such as kinesic
>and proxemic.
>
>There are at least three areas that come under haptic.
>
>1. Touching behaviors involve the intimate, personal and social
>parameters which seem very much like they should either be their own
>elements as derivations or should be high level attributes, i.e.
>attributes of the Primary Base Element haptic itself and that would
>be the first time we would do that. I would prefer not to do that
>because I have always tried to avoid using attributes of elements
>wherever possible. They are messy and introduces a dose of
>computational complexity at a level in the processing of the XML
>schema that is a lot cleaner, faster, and easier without it. In
>addition, touching behaviors, as Len points out vary from culture to
>culture with differing rules for which body parts are used in various
>circumstances, as well as having great differences in the kinds of
>actions with which they are associated, such as hostility or
>affection.
>
>2. Sensory channel functions for which an entire scientific
>discipline and literature exists--the mouse being an example of a
>haptic feedback-control mechanism.
>
>3. Emotional Communication, while it involves or can involve both
>touching and sensory channels extends outward as the interface for a
>wide number of actions and reactions of a non-verbal nature which
>either communicate something completely without combining with any
>verbal communications such as a kiss leading to sexual arousal and
>complete sexual acts or pulling the trigger of a gun which may end
>another human being's life, but which may or may not be an emotional
>communication per se.
>
>I suggest we spend some effort teasing these aspect of haptic apart
>and look at them in light of how they will be included in the
>semiotic engine for processing communications and how they will set a
>precedent for the other elements which fall into the non-verbal
>communications areas.
>
>Please noe that I have not addressed the datatyping values because
>they are amply covered by what Len has already set down. I don't
>think we need to noodle that any further. We just need to come to
>grips with the range of aspects we are considering, which, as I said,
>could easily just be going with what we already have and leaving it
>at that. However, before we do that, I think we need to consider
>these other issues.


-- 
Rex Brooks
Starbourne Communications Design
1361-A Addison, Berkeley, CA 94702 *510-849-2309
http://www.starbourne.com * rexb@starbourne.com



[Date Prev] | [Thread Prev] | [Thread Next] | [Date Next] -- [Date Index] | [Thread Index] | [Elist Home]


Powered by eList eXpress LLC