Subject: RE: [humanmarkup-comment] Base Schema-channel


CHANNEL DESCRIPTION (Re channel vs. communicationChannel, sensoryChannel):

                                        c. S. Candelaria de Ram, 4 June 2002
--                                         "n1", but within a sequence of
communiques; see below ;)
Contents:
DEPARTURE POINT
DISCUSSION:  A COMBINATION
QUESTIONS ARISING?
********* PROPOSABLE (comChannel)
        communicationChannel, <SENSORS-WHOSE^n>, positing:
        n1 Sender, nk Receiver, so that n>=2
        possibly n1=nk (for the case of talking to yourself); 
        also n1 and/or nk have the property of humanness;

        <SENSORS-WHOSE^n> is a finite sequence with n *connecting* members
some of which may be intermediates between n1 and nk,
        a SENSORS-WHOSE may characterize the resp. SENSOR_TYPE,
        but any SENSOR-WHOSE in the sequence may be uninstantiated at any
given moment [or stage in the use of the term definition].

        *********
USED IN MARKUP, ONE EXAMPLE, VERY ROUGH TAGGING OF AN EMAIL HEADER
FURTHER DEVELOPMENT (esp. SENSORS-WHOSE with extension, override)


DEPARTURE POINT

At 08:10 AM 04-06-2002 -0500, Len wrote:
>If the scope of HumanML is human communication, sensoryChannel 
>describes the means of a human receiving information.
>
>What is the purpose of communicationChannel?  What I wish to 
>avoid is opening a very very very large abstraction that 
>subsumes all manner of communication.
>
>Channel may be sufficient.
>
>len

DISCUSSION:  A COMBINATION

The modifiers would seem to be qualifiers and thus each a limiting descriptor.
        Compare:  2 p.m. to 2 p.m. Tuesday, June 4
Of course, it would seem to be equivalent if we did:
        channel, communication
        channel, sensory
        channel, sensory, human
        channel, sensory, dog
        channel, sensory?, PDA

But a channel would seem to be between senders and receivers, as in channels
with endpoints, say:

        channel, human, dog     (talkin' to your dog)
        channel, dog, human     (sensibly talkin' to your dog ;)
        channel, human, computer, phoneline, computer, computer, phoneline,
human   (sending email)
        channel, human (+ composer; + tape-recording), human + human + ...
(singer before audience)
        channel, human + human + human +...,
phoneline+phoneline+phoneline+... +callcenter,
human + human + human + ... (conference call for HumanML meetings)

And that structure might have its advantages too.  It might be used nicely for
HCI (Human-Computer Interaction) characterizations, a present-day human
communication activity.
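
Just to make the shape of those examples visible, here is how they might be
jotted down as ordered sequences in Python -- a rough sketch only, with
made-up names, not proposed HumanML vocabulary:

# Each channel is just an ordered sequence of connecting members,
# from sender (first) to receiver(s) (last).  Names are illustrative.

talking_to_your_dog = ("human", "dog")

sending_email = ("human", "computer", "phoneline",
                 "computer", "computer", "phoneline", "human")

# Branching cases (conference call, performance before an audience)
# need nested structure; a tuple of tuples is one crude way to show it.
conference_call = (
    ("human", "human", "human"),               # several senders
    ("phoneline", "phoneline", "callcenter"),  # intermediates
    ("human", "human", "human"),               # several receivers
)

The branching cases already hint that a flat sequence alone will not be
enough.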

But in the formulation examples as shown, exact specs of the endpoints and
method(s) of transmission vary with given instances.  That being the case,
for HumanML we'd need a metalevel definition to cover all cases, something
on the order of

        channel, <SENSORS-WHOSE^n> , where:
        this <SENSORS-WHOSE^n> is supposed to be an ordered sequence of
conduit-enabling things, 
        n >= 1 because you need at least a sender, 
        a SENSORS-WHOSE expands to describe who is involved and how (incl.
maybe when and where) -- receivers may be unknown [until later], so in
practice the channel may be only partially specified at any given moment.
Note the potential complexities in branching as with things like the
conference call example,
or the recording of a performance for later re-play.  (Not to mention an
intermediate CGI script!)
        a SENSORS-WHOSE may characterize the resp. SENSOR_TYPE

With our focus on characterizing communication, though, we'd really mean to
exclude things without semantic content, I guess, like

        channel, human, rug     (electric shock)
so we might want specifically
        communicationChannel, <SENSORS-WHOSE^n>, n >=1 for use

This already sort of combines Rex's and Len's (and my earlier) points.

QUESTIONS ARISING?

Is it the case that, more generally, when we *in HumanML* combine
specifiers, the net combination is more restrictive, except that we may use
negatives, and so provide exceptions as part of a specification?  

A python example just for fun:

Python 2.2.1c2 (#33, Mar 26 2002, 13:04:18) [MSC 32 bit (Intel)] on win32
Type "copyright", "credits" or "license" for more information.
IDLE 0.8 -- press F1 for help
>>> x == 3
Traceback (most recent call last):
  File "<pyshell#6>", line 1, in ?
    x == 3
NameError: name 'x' is not defined
>>> try:
	if not x == 3:
		print "concrete"
except:
	print "undefined"

	
undefined
>>> x
Traceback (most recent call last):
  File "<pyshell#9>", line 1, in ?
    x
NameError: name 'x' is not defined
>>> try:
	if not x == 3:
		print "concrete"
except NameError:
	print "x is recognized as being absent"

except:
	print "undefined"

	
x is recognized as being absent
>>> 

I don't know whether exceptions are always re-generalization or not.  Too
many negatives to contemplate today.
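
(For anyone on a current Python, the same check can be written as a short
script rather than an IDLE session -- just a sketch of the identical idea,
with print as a function and the bare except narrowed to Exception:)

# Same "concrete vs. recognized absence vs. undefined" check,
# written for Python 3 instead of the Python 2.2 session above.
try:
    if not x == 3:
        print("concrete")
except NameError:
    print("x is recognized as being absent")
except Exception:
    print("undefined")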

Oops -- that "recognized absence" bit would mean we would need to add to
communicationChannel that an audience is expected sometime -- after all,
that is the nature of communication.  So we take one more step to:

********* PROPOSABLE

        communicationChannel, <SENSORS-WHOSE^n>, positing:
        n1 Sender, nk Receiver, so that n>=2
        possibly n1=nk (for the case of talking to yourself); 
        also n1 and/or nk have the property of humanness;

        <SENSORS-WHOSE^n> is a finite sequence with n *connecting* members
some of which may be intermediates between n1 and nk,
        a SENSORS-WHOSE may characterize the resp. SENSOR_TYPE,
        but any SENSOR-WHOSE in the sequence may be uninstantiated at any
given moment [or stage in the use of the term definition].


or so.  This is set up to allow for listening for unidentified senders as
well as sending to unidentified receivers.  The implicit time-bound nature
of communication is embedded in the SENSORS-WHOSE instantiations.
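
To make the shape of that definition concrete, here is one possible rendering
in Python.  This is a sketch only; the names CommunicationChannel,
SensorWhose, sensor_type and is_wellformed are mine for illustration, not
proposed HumanML terms:

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SensorWhose:
    # One *connecting* member of the sequence: who/what is involved, and how.
    who: Optional[str] = None          # may be uninstantiated at any given moment
    sensor_type: Optional[str] = None  # e.g. "email", "phoneline", "hearing"
    human: bool = False

@dataclass
class CommunicationChannel:
    # <SENSORS-WHOSE^n>, n >= 2: members[0] is the sender (n1),
    # members[-1] the receiver (nk); n1 may equal nk (talking to yourself).
    members: List[SensorWhose] = field(default_factory=list)

    def is_wellformed(self) -> bool:
        # n >= 2, and at least one endpoint has the property of humanness.
        return len(self.members) >= 2 and (
            self.members[0].human or self.members[-1].human)

# A channel whose receiver is still unidentified is allowed:
half_open = CommunicationChannel(
    [SensorWhose(who="clbullar@ingr.com", human=True), SensorWhose()])
assert half_open.is_wellformed()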

USED IN MARKUP, ONE EXAMPLE, VERY ROUGH TAGGING OF AN EMAIL HEADER:

Presumably, it could be used in markup in a number of ways....  Let's try it
out for a straightforward case, the headers of the email to which this is a
direct response.  I've indented the tags for ease of reading, and abbreviated
corresponding closing tags with </...>.  Interpolated comments are shown as
[[COMMENT:  comment]] and leaps have ? as in COMMUNIQUE LABELS?? meaning
do-we-want-this-or-something-like-it?  It seems like two labels for our tags
have been proposed, HumanML and HUML -- but some of these things seem to be
tuples rather than single words, so for now I've put <HUML, ....>.


<HUML, COMMUNIQUE LABELS??, global SENSOR_TYPE email>
                [[COMMENT:  I don't know if we can have globals. I guess in
HTML we do, in the header at least, so yes.]]
                [[COMMENT:  The sender and receiver (so-called n1, nk) are
not of email type, though, they are more like HUMANS_WITH_EMAIL_EQUIPMENT!]]

------ begin example

Reply-to: 
        <HUML, CONNECTION, from receiver to potential response:  
        <HUML, comChannel, NEXT nk, reference MESSAGE-REFERENCE-ID:>
<clbullar@ingr.com>
        </...>
Delivered-To: <HUML, comChannel, nk-1 >
HHHHHHHH@klaatu.zianet.com 
        </...>
                [[COMMENT:  sender#k-1 inferrable]]                

Received: 
        <HUML, sensor-whose, receiver#nk-2 > 
(qmail 3951 invoked by alias); 4 Jun 2002 13:10:55 -0000
        </...>
                [[COMMENT:  sender#k-3 inferrable]]       
Delivered-To: 
        <HUML, sensor-whose, receiver#nk-3 >
alias-filterme-HHHHHHHH@zianet.com
        </...>
Received: 
        <HUML, sensor-whose, receiver#nk-4> 
(qmail 3942 invoked by uid 0); 4 Jun 2002 13:10:54 -0000
        </...>
Received: 
        <HUML, sensor-whose, sender#nk-5>
from ckpnt02.intergraph.com (HELO hq15.pcmail.ingr.com) (63.75.137.129)
        </...>
        <HUML, sensor-whose, receiver#nk-5>
  by zianet.com with SMTP; 4 Jun 2002 13:10:54 -0000

        <HUML, sensor-whose, receiver#nk-6>
Received: by hq15.pcmail.ingr.com with Internet Mail Service (5.5.2653.19)
        </...>
        <HUML, MESSAGE ID>
	id <MBX######>; 
        </...>
        <HUML, CLOCKTIME>
Tue, 4 Jun 2002 08:16:03 -0500
        </...>
        <HUML, MESSAGE ID>
Message-ID: <2C61CCE8A870D211A523080009B94E430752B51B@HQ5>
        </...>

From: 
        <HUML, SENDER, n1+1: HUML, PERSON_NAME: LAST_NAME , FIRST_NAME
MIDDLE_NAME NICK_NAME EMAIL_NAME @ EMAIL_ADDRESS>
"Bullard, Claude L (Len)" <clbullar@ingr.com>
        </...>
                [[COMMENT:  probably parser output]]
To: 
        <HUML, RECEIVER, nk:  HUML, PERSON_NAME: FIRST_NAME, LAST_NAME,
EMAIL_NAME, EMAIL_ADDRESS>
        </...>
'Rex Brooks' <rexb@starbourne.com>, 
        <HUML, RECEIVER, nk:  HUML, EMAIL_NAME, EMAIL_ADDRESS>
        </...>
HHHHHHHH@zianet.com, 
        </...>
        <HUML, RECEIVER, nk:  HUML, PERSON_NAME??: EMAIL_NAME, EMAIL_ADDRESS>
	humanmarkup-comment@lists.oasis-open.org
        </...>

        <HUML, MESSAGE ID??>
Subject: RE: [humanmarkup-comment] Base Schema-channel
        </...>

Date: 
        <HUML, CLOCKTIME>
Tue, 4 Jun 2002 08:10:11 -0500 
        </...>

                [[COMMENT:  inference of relation between CLOCKTIME AND
MESSAGE TRANSMISSION/CHANNEL?]]

        <HUML, comChannel, sender n1, mode?>
X-Mailer: Internet Mail Service (5.5.2653.19)
        </...>

</HUML, COMMUNIQUE LABELS??>

                [[COMMENT:  If we have a convention that in the domain of
HumanML (HUML above) </> means use-the-most-recent TAG it could make things
a heck of a lot more readable!  Is that permissible? SC]]

------ end example
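
One could imagine partly mechanizing the rough tagging above.  Purely as an
illustration -- the standard-library calls are real, but the mapping of
header fields onto SENSOR-WHOSE roles is my own guess -- Python's email
module can pull out the From/To endpoints and the Received chain that
corresponds roughly to the intermediate nk-2 ... nk-6 members:

from email import message_from_string

def channel_members_from_headers(raw_message):
    # Very rough: list candidate SENSOR-WHOSE entries from the headers.
    msg = message_from_string(raw_message)
    members = [("sender n1", msg.get("From")),
               ("receivers nk", msg.get_all("To", []))]
    # Each Received: header names one relaying intermediate.
    for hop in msg.get_all("Received", []):
        members.append(("intermediate", hop))
    return members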



FURTHER DEVELOPMENT 

SENSOR-WHOSE and its requisite *connecting* may need further specification:
anybody?    

For channel characterization, there's
        medium, mode, ..., 

There's some stuff in semiotics about it, some in discourse theory, some in
information theory, some in anthro, probably some in your various
implementations like VRML....  

Anybody have a good working set? 

Len has meanwhile mentioned the SMIL term inventory, and Rex that of
>the EMOTE program 
>from The University of Pennsylvania's Center for Human Modeling and 
>Simulation.

Also, Len has cited as primitives:

sight
hearing
touch
taste
smell

to which Rob has added 

kinesthetics

for which Len has nicely brought in a specific description:

>That is "the sense of where body parts are relative to each other" or 
>
>"the ability to feel movements of the limbs and body"?
>
>http://onlinedictionary.datasegment.com/word/kinesthesia
>
>kinesthesia n 1: the perception of body position and movement and muscular
tensions etc [syn: kinaesthesia, feeling of movement] 2: the ability to feel
movements of the limbs and body [syn: kinesthesis, kinaesthesis,
kinaesthesia, kinesthetics, muscle sense , sense of movement] [ant:
kinanesthesia] 

To this Rob quickly adds that:

> The "perception" (feeling) of where the body parts are, is the result of
the signals received from Muscle Spindles and Golgi tendon organs run
through an "internal model" of our bodies structure that has been built up
over time.  This "model"  changes over time as our bodies change, so it is a
dynamic process.

> kinesthetic: the understanding of the
>internal state of the body -- where one's body parts are relative to each
>other and gravity (or other forces), e.g., joint angles, proximities,
>orientation.  Touch includes external perceptions such as contact, pressure,
>and temperature; kinesthetics can also include internal attributes such as
>aches, pain, discomfort, pressure, soreness, etc.

One might also contend that in the plural (sender(s), receiver(s)) this gets
us proxemics:

kinesthetics of agent1 x kinesthetics of agent2 --> proxemics
-- first in geospace, by extension in cognitive space.


Experimental psychology gets you quickly beyond gross sensors in other ways
as well.  Sight, for instance, expands to grays, to three colors, to edge
and movement detection (especially peripheral, within the scope of the
momentary retinal field), to symmetry with remembered images, to
face-recognition specialization (apparently with built-in sensitivity to
smiles), to mouth position -- even infants under 36 hours respond to o,
eeee, aah positions with discrimination....  This brings us quickly to Len's

>  Speech, hand gestures (formal), 
>postures or body language (informal), whistling, singing, 
>etc. are all kinds of communication, but are they  
>channels per se?  

Yes, with cultural overtones.  And for humans, cultural overtones cannot be
avoided during interpretation.



Furthermore, as people are pointing out,  communication is a concrete and
temporal process:

Len: 
>I set abstract to false assuming this is an element type that 
>is instantiable.

So all communication Events become INSTANTIABLE and non-abstract, apparently.

>Fair enough.   So an element type for that would not be abstract  
>and it will be a complex type. 
>
>So how to define that?  The properties might include a body part 
>name, a named body part location, a movement, and the tension values 
>of musculature.  In other words, kinesthesia is a name for a kind 
>of experience, not just a set of properties.  The model is as you 
>say, built up over time, so these are different for each person 
>and for each person given some chronemic values.
>
>How to account for dynamism?  One ends up creating a language 
>for it for which one might consider the other values we 
>are talking about as parameters to pass to the model.
>
>We have been postponing the process discussion.  Maybe 
>we have to bring it forward.

Whereby Paul Prueitt's and Brian Newman's commentaries hold that there should
be related variations in semantics and the respective ontology... "a
stratified complexity", citing Pribram's 1991 "Brain and Perception" and a
view "that all of the senses are mixed into an experience of world that is
unified."  In other words, communication is grounded in real-world
phenomena.  In other other words, HumanML has to deal with the semiotic,
or, as Len says, "Words are not what they represent" -- the value can be
distinguished from the variable.  Therein lies our hope of success, perhaps!

>All of the sense associations are "intertwined" in some type of complex
>manifold, each sensory input (or group of sensory inputs) can act as a
>"decoding" key for activation of a dynamic memory recall.  This "decoding
>process" of associations can be viewed as an unfolding of related events that
>generates a type of trajectory (that branches) through "information" space with
>something akin to a type of "momentum".  That is why it is often very hard to
>change our minds about things.
>
>Pribram's perspective on this is reflected in his description of "The
>Holoscape".
>
>I like Poincare's insight that;
>
>"Objects are not fleeting and fugitive appearances, because they are not only
>groups of sensations, but groups cemented by a constant bond.  It is this bond
>alone, which is the object in itself, and this bond is a relation."
>
>However, I think that his "constant bond" is a little more dynamic than he
>might at first believe.
>
>The "senses" and our "perceptions" that arise out of them, functioning in a
type
>of feedback loop which modulate our experience of "a" world or world(s), and as
>such must be viewed as being unified in a real sense with that of the "world"
>model that we've created over our life time of thought and experience.  It is
>this "reinforcement" of previously experienced relationships that can be viewed
>as trajectory (with something akin to momentum) through what we call an
>"information/experience space".
>
>

Thus -- we must provide for dynamism of our term inventory, as well as some
capacity for culturally differing terms (individualization for any size
group), right?  This is a meta-Huml feature, but could simplify settling on
terms for the nonce:

*********
Suppose we have an extension method (not necessarily programming mechanism,
just definitional method) for both extension and [restricted?] override....
***********
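
Purely as a sketch of that definitional idea (none of this is a proposed
mechanism, and the term entries are invented for illustration), layered term
inventories with extension and restricted override might look like:

from collections import ChainMap

# Base (shared) term inventory for the nonce.
base_terms = {
    "sight":   "visual sensory channel",
    "hearing": "auditory sensory channel",
    "touch":   "tactile sensory channel",
}

# A group-specific layer may EXTEND the inventory...
extension = {"kinesthetics": "sense of body position and movement"}

# ...or OVERRIDE a base term (ideally under some agreed restriction).
override = {"touch": "tactile channel, incl. pressure and temperature"}

# Lookups consult the most specific layer first, falling back to the base.
group_inventory = ChainMap(override, extension, base_terms)

print(group_inventory["touch"])   # overridden definition
print(group_inventory["sight"])   # inherited unchanged from the base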

Then we can, as Len urges:
>We have to break a domain down to schematize it.  Emergent properties 
>arise out of controls over engaging forces, 

so that even if as Rob notes, they:

>"overlap" with each other in a somewhat fuzzy way

and as Len adds, the need to provide for variability in

> point of view of the analyst and the question itself,

is taken care of:  We sure have general consensus on the problem now!

SC


;)
SC


>
>-----Original Message-----
>From: Rex Brooks [mailto:rexb@starbourne.com]
>
>
>a sensoryChannel would be a conduit for input information into a 
>human object, i.e. an instantiation of the human element
>
>a communicationChannel would be a conduit of message-bearing energy
>
>a signal would be message-bearing energy (which we will still revisit 
>in order when we get there, realising that it may be further refined 
>by that time.)
>
>While it would be possible to derive these from channel as it is 
>written in the straw man, I think it would necessitate a third level 
>of abstraction as a secondary base schema, so to speak, so what I 
>propose is that we take the time to define some basic, if derived, 
>elements to avoid a secondary base schema just for these top level 
>derivations. I do think that these distinctions will turn up for many 
>of our singular base elements.
>


