OASIS Mailing List Archives



huml message


Subject: Fwd: Grant information related to Gesture Recognition/TrainingSimulations

I was working on grant proposals in my other incarnation in our sister organization, Humanmarkup.org, Inc., which also has a meeting on Yahoo chat late tomorrow afternoon, and thought I would forward this to both lists, with thanks to Ranjeeth. These sources are valuable. I'm remembering why writing grant proposals is no one's favorite activity, well, almost no one's. I commend anyone who develops facility with this task. Narrowing focus is not easy for a generalist like me, so the work is going more slowly just when it needs to go faster. (Why am I not surprised?)


X-Authenticated-User: rkthunga.interposting.com
From: "Ranjeeth Kumar Thunga" <rkthunga@interposting.com>
To: "'Rex Brooks'" <rexb@starbourne.com>
Subject: Grant information
Date: Fri, 21 May 2004 08:28:28 -0400
thread-index: AcQ/Lx0DFmW0TgltQKW79RsDGlenbA==
X-Rcpt-To: <rexb@starbourne.com>
X-DPOP: Version number supressed
Status: U
Hi Rex,
You've probably gotten a chance to do some of your own research (due to my delay in getting back to you), but I nonetheless wanted to share some of my thoughts on the different areas you provided.
I found some references that I thought were related to the grant proposals and of use to the PIs, and I assembled some thoughts on how HumanML could be applied in the situations related to NSF grants that you outlined. Please let me know if you'd like me to elaborate further on any of these areas, and whether this is helpful. I kept my explanations fairly general, so I can elaborate as needed.
Talk soon.
Ranjeeth Kumar Thunga
Links related to Gesture Recognition:
Store and communicate data gathered between various input devices and output modalities for people (Rob/Jim's current work)
Store and catalog behavioral and physical qualities of individuals for further diagnosis, for purposes including establishment/validation of criteria for physical/psychological pathologies.
Tools for those with specific psychological and communication disabilities (in addition to physical disabilities), through which individuals can extend their means of expression to communicate with others via augmented communication systems.
Biofeedback learning: change behavioral responses through various types of feedback mechanisms, allowing disabled individuals to learn new behaviors over time while the computer adapts its feedback based on HumanML data. Feedback can be provided through various modalities.
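As a rough illustration of the first item above (storing and communicating gathered data between devices), here is a minimal sketch that packages behavioral observations as an XML fragment in the spirit of HumanML. The element and attribute names (humanProfile, gesture, intensity) are hypothetical placeholders, not taken from any published HumanML schema.

```python
# Hypothetical sketch: wrap per-individual behavioral observations in an
# XML fragment so they can be stored or sent between input/output devices.
# Element names are illustrative only, not a real HumanML vocabulary.
import xml.etree.ElementTree as ET

def build_profile(person_id, observations):
    """Build an XML fragment from (gesture, intensity) observation pairs."""
    root = ET.Element("humanProfile", id=person_id)
    for gesture, intensity in observations:
        ET.SubElement(root, "gesture", name=gesture,
                      intensity=str(intensity))
    return ET.tostring(root, encoding="unicode")

xml_fragment = build_profile("subject-01", [("nod", 0.8), ("wave", 0.3)])
print(xml_fragment)
```

A consumer on the other end would parse the fragment back with the same library and map the attributes onto whatever diagnostic criteria it tracks.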
Links related to Simulation and Training Environments:
www.rti.org/avatalk/ (sounds familiar… we may have discussed this on our list?)
Can assist in simulating responses to different types of physical or psychological expressions of people.
Can help track, store, and interpret behavioral responses by participants in order to determine the response time and effectiveness of these different responses.
Can provide visual documentation of how to respond to different types of emergency scenarios.
Lesson plans for dynamic models can be created to represent the specific physical attributes and disabilities of the individual onscreen, with appropriate behaviors targeted for that particular person.
Domains can include: medicine, law enforcement, foreign-assistance/refugee operations, and military operations
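To make the response-tracking idea above concrete, here is a small illustrative sketch (not part of the proposal itself) of recording participants' response times per scenario in a training simulation, so the timings could later be stored alongside HumanML-style behavioral data. The class and scenario names are invented for the example.

```python
# Illustrative sketch: log how long a participant takes to respond to
# each simulation scenario and summarize the results. All names here
# are hypothetical, not from any HumanML specification.
import time

class ResponseTracker:
    def __init__(self):
        self.events = []  # list of (scenario, elapsed_seconds)

    def record(self, scenario, started, responded):
        """Store the elapsed response time for one scenario prompt."""
        self.events.append((scenario, responded - started))

    def mean_response_time(self):
        """Average response time across recorded events, or None if empty."""
        if not self.events:
            return None
        return sum(t for _, t in self.events) / len(self.events)

tracker = ResponseTracker()
t0 = time.monotonic()
# ... participant reacts to two prompts (times stubbed for the example) ...
tracker.record("fire-evacuation", t0, t0 + 2.5)
tracker.record("triage", t0, t0 + 4.0)
print(tracker.mean_response_time())  # → 3.25
```

In an actual trainer, the stubbed timestamps would come from the simulation's event loop, and each record could carry a reference to the participant's HumanML profile.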

Rex Brooks
GeoAddress: 1361-A Addison, Berkeley, CA, 94702 USA, Earth
W3Address: http://www.starbourne.com
Email: rexb@starbourne.com
Tel: 510-849-2309
Fax: By Request
