Subject: [humanmarkup-comment] on doing research with SLIP
New tutorial: http://www.ontologystream.com/journal/ACTutorial.htm

Please call me if you would like me on the phone while you work through this tutorial (about 15 minutes).

We have a Journal at: http://www.ontologystream.com/journal/JournaleventChemistry.htm

Please consider looking at this link-analysis research tool and developing a contribution to the Journal.

What we are particularly interested in is the development of a markup language that corresponds to the human side of the state-gesture interaction and that can be used to reduce the variability of a voice command interpreter. What grammar elements are needed to allow a human voice to completely control the development and use of visual abstractions? Of course, such a theory of state-gesture (as a controller of the SLIP-produced visual abstractions) will also work in contexts not related to voice control.

An example of such a markup language exists at: http://www.bcngroup.org/area3/gtonfoni/tonfoni.htm

Dr. Paul Prueitt
Founder (1997) BCNGroup
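One way to picture the kind of grammar elements being asked about is a small markup fragment that binds a voice command to a gesture state and a visual abstraction. This is purely a hypothetical sketch: the element and attribute names below are my own illustration, not part of SLIP or any published schema.

```xml
<!-- Hypothetical sketch only: these element and attribute names are
     illustrative inventions, not drawn from SLIP or any existing spec. -->
<stateGesture state="category-view" confidence="0.9">
  <!-- The verb/target pair licensed by the current interaction state -->
  <voiceCommand verb="expand" target="cluster-7"/>
  <!-- The visual abstraction the command operates on -->
  <visualAbstraction kind="link-graph" action="zoom"/>
</stateGesture>
```

Constraining the interpreter to only the verbs and targets licensed by the current state element is one sense in which such a grammar could reduce the variability a voice command interpreter must handle.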